Technology Alert – PLG Joins Blockchain Initiative

Patrick Law Group Becomes Member of Legal Center of Excellence, a New Global Platform for Studying Developments in Blockchain

Patrick Law Group is pleased to announce that the firm has joined a new global legal initiative, the Legal Center of Excellence (LCoE), which is devoted to advancing legal thought leadership and sharing best practices regarding blockchain technology.  The LCoE was established by R3, an enterprise software firm based in London, New York and Singapore that is working with over 200 institutions to develop applications on its distributed ledger platform, Corda.

As a member of the LCoE, PLG will have access to R3’s research on blockchain and monthly demonstrations that will provide PLG attorneys insight into real world blockchain applications. Richard Gendal Brown, R3’s chief technology officer, said the LCoE “will allow R3 to directly engage with the lawyers that will be advising on and helping draft the smart contracts used by the network of Corda users across the globe.”

PLG’s participation in the LCoE demonstrates PLG’s commitment to staying at the forefront of legal and technological developments.  We look forward to working with our clients on blockchain and other emerging technologies.

IoT Device Companies: The FTC is Monitoring Your COPPA Data Deletion Duties and More

By Jennifer Thompson


Recent Federal Trade Commission (FTC) activities with respect to the Children’s Online Privacy Protection Act (COPPA) demonstrate a continued interest in, and increased scrutiny of, companies subject to COPPA.  While the FTC has pursued companies for alleged violations of all facets of its COPPA Six Step Compliance Plan, most recently the FTC has focused on the obligation to promptly and securely delete all data collected if it is no longer needed.  Taken as a whole, recent FTC activity may indicate a desire on the part of the FTC to expand its regulatory reach.

First, as documented in “IoT Device Companies: Add COPPA to Your ‘To Do’ Lists,” the FTC issued guidance in June 2017 that “Internet of Things” (“IoT”) companies selling devices used by children are subject to COPPA and may face increased scrutiny from the FTC with respect to their data collection practices.  While COPPA was originally written to apply to online service providers and websites, this guidance made it clear that COPPA’s reach extends to device companies.  In general, this action focused on step 1 of the Compliance Plan (general applicability of COPPA), while also providing some guidance on how companies could comply with step 4 (obtaining verifiable parental consent).

Then, in January 2018, the FTC entered its first-ever settlement with an internet-connected device company resulting from alleged violations of COPPA and the FTC Act.  As discussed in “IoT Device Companies: COPPA Lessons Learned from VTech’s FTC Settlement,” the FTC alleged violations by the device company of almost all the steps in the Compliance Plan, including failure to appropriately post privacy policies (step 2), failure to notify parents of the intended data collection activities prior to data collection (step 3), failure to verify parental consent (step 4) and failure to implement adequate security measures to protect the data collected (step 6).  The significance of the settlement was that it solidified the earlier guidance that COPPA governs device companies, in addition to websites and online application providers.

In April 2018, the FTC further expanded its regulatory reach by sending warning letters alleging potential COPPA violations to two device/application companies located outside the United States.  Both companies collected precise geolocation data on children in connection with devices worn by the children.  The warning letters clarified that, although the companies were located outside the United States, the FTC deemed them subject to COPPA because: a) their services were directed at children in the United States; and b) the companies knowingly collected data from children in the United States.  Interestingly, one of the targeted companies, Tinitell, Inc., was not even selling its devices at the time of the letter’s issuance.  Nonetheless, the FTC warned that because the Tinitell website indicated the devices would work through September 2018: a) COPPA would continue to apply beyond the sale of the devices; and b) the company remained obligated to take reasonable measures to secure the data it had collected and would continue to collect.

Most recently, the FTC again took to its blog to remind companies that COPPA obligations pursuant to step 6 (implement reasonable procedures to protect the security of kids’ personal information) may extend even beyond the termination of the company’s relationship with the child.  Although “reasonable security measures” is a broad concept, the FTC homed in on the duty to delete data that is no longer required.

Section 312.10 of COPPA states that companies may keep personal information obtained from children under the age of 13 “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.”  Once that purpose has been fulfilled, the company must delete the information, using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.

On May 31, 2018, the FTC published a blog post entitled “Under COPPA, data deletion isn’t just a good idea.  It’s the law.”  The post reminds website and online service providers subject to COPPA (and, by extension, any device companies that market internet-connected devices to children) that there are situations in which COPPA requires them to delete the personal information they have collected from children, even if the parent does not specifically request the deletion.  This guidance establishes an affirmative duty on the collecting company to self-police and to securely discard the information as soon as it is no longer needed, even absent a customer request.

The blog post further suggests that all companies review their data retention policies to ensure that the stated policies adequately address the following questions:

  • What types of personal information are you collecting from children?
  • What is your stated purpose for collecting the information?
  • How long do you need to hold on to the information to fulfill the purpose for which it was initially collected? For example, do you still need information you collected a year ago?
  • Does the purpose for using the information end with an account deletion, subscription cancellation, or account inactivity?
  • When it’s time to delete information, are you doing it securely?
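For device companies that want to self-police along these lines, the questions above translate naturally into an automated retention check. The following Python sketch is purely illustrative: the record fields, purpose names and retention windows are assumptions invented for this example, not anything COPPA or the FTC prescribes.

```python
from datetime import datetime, timedelta

# Hypothetical purposes and retention windows; COPPA does not prescribe
# any particular schema or time limits -- these are illustrative only.
RETENTION_BY_PURPOSE = {
    "account_service": None,               # keep while the account is active
    "support_ticket": timedelta(days=90),  # assumed internal policy window
}

def records_due_for_deletion(records, now):
    """Flag child-data records whose stated purpose no longer justifies retention.

    Each record is a dict with keys: 'purpose', 'collected_at' (datetime),
    and 'account_active' (bool). Returns the records that should be routed
    to secure deletion even though no parent has requested deletion.
    """
    due = []
    for rec in records:
        window = RETENTION_BY_PURPOSE.get(rec["purpose"])
        if window is None:
            # Purpose tied to the account: deletion is due once the account ends.
            if not rec["account_active"]:
                due.append(rec)
        elif now - rec["collected_at"] > window:
            # Fixed-window purpose has been fulfilled.
            due.append(rec)
    return due
```

Securely discarding the flagged records (overwriting, crypto-shredding, purging backups) is a separate step; the point of the sketch is only the affirmative, request-independent trigger the FTC's guidance describes.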

It will be interesting to see whether the FTC continues to focus on COPPA in its enforcement actions.  All told, the FTC has brought around thirty actions pursuant to COPPA.  But recent activity, like the warning letters to international companies and the recent guidance on data deletion, indicates that the FTC may be expanding the arena for COPPA applicability.

OTHER THOUGHT LEADERSHIP POSTS:

IoT Device Companies: The FTC is Monitoring Your COPPA Data Deletion Duties and More

By Jennifer Thompson. Recent Federal Trade Commission (FTC) activities with respect to the Children’s Online Privacy Protection Act (COPPA) demonstrate a continued interest in, and increased scrutiny of,...

Predictive Algorithms in Sentencing: Are We Automating Bias?

By Linda Henry. Although algorithms are often presumed to be objective and unbiased, recent investigations into algorithms used in the criminal justice system to predict recidivism have produced compelling...

My Car Made Me Do It: Tales from a Telematics Trial

By Dawn Ingley. Recently, my automobile insurance company gauged my interest in saving up to 20% on insurance premiums.  The catch?  For three months, I would be required to install a plug-in monitor that...

When Data Scraping and the Computer Fraud and Abuse Act Collide

By Linda Henry. As the volume of data available on the internet continues to increase at an extraordinary pace, it is no surprise that many companies are eager to harvest publicly available data for their own use...

Is Your Bug Bounty Program Uber Risky?

By Jennifer Thompson. In October 2016, Uber discovered that the personal contact information of some 57 million Uber customers and drivers, as well as the driver’s license numbers of over 600,000 United States...

IoT Device Companies: COPPA Lessons Learned from VTech’s FTC Settlement

By Jennifer Thompson. In “IoT Device Companies: Add COPPA to Your ‘To Do’ Lists,” I summarized the Federal Trade Commission (FTC)’s June 2017 guidance that IoT companies selling devices used by children will be...

Beware of the Man-in-the-Middle: Lessons from the FTC’s Lenovo Settlement

By Linda Henry. The Federal Trade Commission’s recent approval of a final settlement with Lenovo (United States) Inc., one of the world’s largest computer manufacturers, offers a reminder that when it comes to...

#TheFTCisWatchingYou: Influencers, Hashtags and Disclosures 2017 Year End Review

Influencer marketing, hashtags and proper disclosures were the hot button topic for the Federal Trade Commission (the “FTC”) in 2017, so let’s take a look at just how the FTC has influenced Social Media Influencer Marketing in 2017. First, following up on the more...

Part III of III | FTC Provides Guidance on Reasonable Data Security Practices

By Linda Henry. This is the third in a series of three articles on the FTC’s Stick with Security blog. Part I and Part II of this series can be found here and here. Over the past 15 years, the Federal Trade...

Apple’s X-Cellent Response to Sen. Franken’s Queries Regarding Facial Recognition Technologies

By Dawn Ingley. Recently, I wrote an article outlining the growing body of state legislation designed to address and mitigate emerging privacy concerns over facial recognition technologies.  It now appears that...

Predictive Algorithms in Sentencing: Are We Automating Bias?

By Linda Henry


Although algorithms are often presumed to be objective and unbiased, recent investigations into algorithms used in the criminal justice system to predict recidivism have produced compelling evidence that such algorithms may be racially biased.  As a result of one such investigation by ProPublica, the New York City Council recently passed the first bill in the country designed to address algorithmic discrimination in government agencies. The goal of New York City’s algorithmic accountability bill is to monitor algorithms used by municipal agencies and provide recommendations as to how to make the City’s algorithms fairer and more transparent.

The criminal justice system is one area in which governments are increasingly using algorithms, particularly in connection with creating risk assessment profiles of defendants.  For example, COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is a computer algorithm used to score a defendant’s risk of recidivism and is one of the risk assessment tools most widely used by courts to predict recidivism.  COMPAS creates a risk assessment by comparing information regarding a defendant to historical data from groups of similar individuals.

COMPAS is only one example of proprietary software being used by courts to make sentencing decisions, and states are increasingly using software risk assessment tools such as COMPAS as a formal part of the sentencing process.  Because many such algorithms, including COMPAS, are proprietary, the source code is not published and is not subject to state or federal open records laws.  As a result, the opacity inherent in proprietary programs such as COMPAS prevents third parties from seeing the data and calculations that impact sentencing decisions.

Challenges by defendants to the use of such algorithms in criminal sentencing have been unsuccessful.  In 2017, Eric Loomis, a Wisconsin defendant, unsuccessfully challenged the use of the COMPAS algorithm as a violation of his due process rights.  In 2013, Loomis was arrested and charged with five criminal counts related to a drive-by shooting.  Loomis maintained that he was not involved in the shooting but pled guilty to driving a motor vehicle without the owner’s permission and fleeing from police.  At sentencing, the trial court judge sentenced Loomis to six years in prison, noting that the court ruled out probation based in part on the COMPAS risk assessment that suggested Loomis presented a high risk to re-offend.[1]  Loomis appealed his sentence, arguing that the use of the risk assessment violated his constitutional right to due process.  The Wisconsin Supreme Court ultimately affirmed the lower court’s decision that it could utilize the risk assessment tool in sentencing and found no violation of Loomis’ due process rights.  In 2017, the U.S. Supreme Court denied Loomis’ petition for a writ of certiorari.

The use of computer algorithms in risk assessments has been touted by some as a way to eliminate human bias in sentencing.  Although COMPAS and other risk assessment software programs use algorithms that are race neutral on their face, the algorithms frequently use data points that can serve as proxies for race, such as ZIP codes, education history and family history of incarceration.[2]  In addition, critics of such algorithms question the methodologies used by programs such as COMPAS, since methodologies (which are necessarily created by individuals) may unintentionally reflect human bias.  If the data sets being used to train the algorithms are not truly objective, human bias may be unintentionally baked into the algorithm, effectively automating human bias.

The investigation by ProPublica that prompted New York City’s algorithmic accountability bill found that COMPAS risk assessments were more likely to erroneously identify black defendants as presenting a high risk for recidivism, at almost twice the rate of white defendants (43 percent vs. 23 percent).  In addition, ProPublica’s research revealed that COMPAS risk assessments erroneously labeled white defendants as low-risk 48 percent of the time, compared to 28 percent for black defendants.  Black defendants were also 45 percent more likely to receive a higher risk score than white defendants, even after controlling for variables such as prior crimes, age and gender.[3]  ProPublica’s findings raise serious concerns regarding COMPAS; however, because the calculations used to assess risk are proprietary, neither defendants nor the court systems utilizing COMPAS have visibility into why the assessments mislabel black and white defendants at such different rates.
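For readers unfamiliar with the terminology, the first figures above are false positive rates: among defendants who did not go on to reoffend, the share who were nonetheless scored high risk. A minimal Python sketch with hypothetical counts (chosen only to echo the percentages cited above, not drawn from ProPublica's actual dataset) makes the computation explicit:

```python
def false_positive_rate(wrongly_flagged_high_risk, did_not_reoffend):
    """Share of non-reoffending defendants who were scored high risk anyway."""
    return wrongly_flagged_high_risk / did_not_reoffend

# Hypothetical cohorts of 1,000 non-reoffending defendants each
# (illustrative numbers only, mirroring the percentages cited above):
black_fpr = false_positive_rate(430, 1000)
white_fpr = false_positive_rate(230, 1000)

print(f"black FPR = {black_fpr:.0%}, white FPR = {white_fpr:.0%}")
print(f"disparity ratio = {black_fpr / white_fpr:.2f}")
```

The same arithmetic applied to non-high-risk scores among defendants who did reoffend gives the false negative rates the article cites; the disparity between groups, not the formula itself, is what ProPublica's analysis highlighted.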

Although New York City’s algorithmic accountability bill aims to curb algorithmic bias and bring more transparency to algorithms used across all New York City agencies, including those used in criminal sentencing, the task force faces significant hurdles.  It is unclear how the task force will make the threshold determination as to whether an algorithm disproportionately harms a particular group, or how the City will increase transparency and fairness without access to proprietary source code.  Despite the daunting challenge of balancing the need for more transparency against the right of companies to protect their intellectual property, critics of the use of algorithms in the criminal justice system are hopeful that New York City’s bill will encourage other cities and states to acknowledge the problem of algorithmic bias.

 


[1] State v. Loomis, 881 N.W.2d 749 (2016)

[2] Hudson, L., Technology Is Biased Too. How Do We Fix It?, FiveThirtyEight (Jul. 20, 2017), https://fivethirtyeight.com/features/technology-is-biased-too-how-do-we-fix-it/.

[3] Julia Angwin et al., Machine Bias, ProPublica (May 23, 2016), https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.


Linda Henry | Achieva Graduate

Congratulations to Deputy Counsel and VP of Operations, Linda Henry, on her graduation from Achieva®, a year-long Pathbuilders program focused on helping women become high-impact leaders.  The program included monthly educational workshops, peer networking and one-on-one mentoring. https://www.pathbuilders.com/pathbuilders-achieva-packs-years-into-hours/

Lizz Patrick serves as a mentor for the Pathbuilders organization and believes in its mission wholeheartedly. It goes without saying that Lizz is extremely proud of Linda for having reached the Achieva® level in the Pathbuilders Program Series!

Pathbuilders Achieva Mentors

11/20/17 #Achieva #mentees & #mentors heard from an executive panel on strategic communications. Lizz Patrick moderated our panel of Cathy Adams, Mike Page, Alicia Thompson, and Lynne Zappone.

My Car Made Me Do It: Tales from a Telematics Trial

By Dawn Ingley


Recently, my automobile insurance company gauged my interest in saving up to 20% on insurance premiums.  The catch?  For three months, I would be required to install a plug-in monitor that collected extensive metadata—average speeds and distances, routes routinely traveled, seat belt usage and other types of data.  But to what end?  Was the purpose of the monitor to learn more about my driving practices and to encourage better driving habits?  To share my data with advertisers wishing to serve up a buy-one, get-one free coupon for paper towels from my favorite grocery store (just as I pass by it) on my touchscreen dashboard?  Or to build a “risk profile” that could be sold to parties (AirBnB, banks, other insurance companies) who may have a vested interest in learning more about my propensity for making good decisions?  The answer could be, “all of the above.”

Wireless technology integrated into vehicles is nothing new—indeed, diagnostic systems, cellular connections and in-dash navigation have been the norm for years.  However, the breadth of data collection and the manner in which data is monetized are evolving quickly.  Telematics platforms are actually very akin to social media platforms in terms of the sheer volumes of data collected and the purposes for which that data can (and could) be used.  To be sure, many use cases stand to benefit drivers—predicting oil changes and locating the nearest gas station, for example.  Future functionality may even detect texting while driving or sleeping drivers.

Opportunities abound, and at least one company’s early success is proof of the sheer potential of telematics data mining.  In the past three years, Otonomo has carved out a niche for itself by brokering sales of mined telematic data to parties such as insurance companies and general retail businesses.  Otonomo’s technology efficiently packages telematics data into a user-friendly, anonymized platform that takes into account worldwide regulations governing telematics.

Not surprisingly, several automobile manufacturers predict the sale of automobile analytics will be a key profit center in coming years.  What is surprising, however, is the lack of legal “rules of the road” that exist today in the United States.  While laws do clarify that an automobile’s event data recorders are owned by the automobile owner (and provide that these “black boxes” may be obtained only by court order), other laws governing telematics are few and far between.  A driver’s consent often occurs upon registration of embedded GPS platforms or other navigation tools, but according to Government Accountability Office research, these types of notices often are lacking in terms of explaining how data is used and whether it is shared.  The Federal Trade Commission maintains jurisdiction over consumer data and related privacy issues, but there are not yet rules specific to telematics data collected by the automobile industry.

Much like the credit card industry’s promulgation of the Payment Card Industry Data Security Standard (PCI DSS), the automotive industry responded in 2014 with its own Privacy Principles for Vehicle Technologies and Services, which include the following:

  • Transparency: a commitment to provide both owners and registered users of vehicles with access to “clear, meaningful notices” as to what data is collected, used and shared.
  • Choice: a commitment to provide owners and registered users with certain choices “regarding the collection, use and sharing” of information.
  • Respect for Context: a commitment to use and share information in a manner consistent with the context in which information was collected.
  • Data Minimization, De-identification, and Retention: a commitment to collect information only as needed for legitimate business purposes, and to retain it no longer than needed for such legitimate business purposes.
  • Data Security: a commitment to implement reasonable measures to protect information against loss and unauthorized access or use.
  • Integrity and Access: a commitment to implement measures to maintain the accuracy of information, along with a means for owners and registered users to correct information.
  • Accountability: a commitment to take reasonable steps to ensure that any parties receiving the information adhere to the principles.

To date, twenty automakers have signed on to the principles, including Honda, Toyota, Nissan, Subaru and Hyundai.

Congress has also responded to concerns over privacy and security in automobiles.  In early 2017, Representatives Joe Wilson (R-SC, 2nd District) and Ted Lieu (D-CA, 33rd District) introduced the SPY Car Study Act.  The bill does not introduce any new laws or regulations, but would require the National Highway Traffic Safety Administration (NHTSA) to investigate technological threats to automobiles.  More specifically, the bill would task the NHTSA with identifying:

  • Measures necessary to separate critical software systems that affect a driver’s control of a vehicle from other technology systems;
  • Measures necessary to detect and prevent codes associated with malicious behaviors;
  • Techniques necessary to detect and prevent, discourage or mitigate intrusions into vehicle software systems and other cybersecurity risks in automobiles;
  • Best practices to secure driver data collected by electronic systems; and
  • A timeline for implementation of technology to reflect such best practices.

Otonomo has indicated that the current market for automobile telematics data focuses on user experience and convenience, but, in reality, no future use case is off the table.  And as with many technologies and, in particular, IoT platforms, drivers must weigh the benefits and dangers of use.  The calculus would look something like this:

  • Benefits (current and future):
    • Traffic and navigation services save drivers time and reduce risk of further traffic accidents;
    • Automobile diagnostics can not only remind drivers of to-do’s such as oil changes, but also alert drivers to issues such as dangerous behaviors (texting or sleeping while driving, blood alcohol level); and
    • Automobile insurance discounts may be a “reward” for drivers supplying metadata.
  • Risks:
    • Customer “lock-in”—could data as to driving habits (miles driven, speeds, use of turn signals) keep a customer from changing insurance carriers, if prospective carriers refuse coverage based on a driver’s metrics?
    • Will lenders factor risky driving behaviors into decisions as to whether credit or loans are extended?
    • Will current insurers raise premiums based on activities tracked via collection of metadata?

Indeed, the answer to this calculus may vary across geographies and cultures.  In the United States, there is no across-the-board approach to privacy and data protection; rather, protections are extended to particular industries (e.g., HIPAA for healthcare data).  U.S. citizens have proven more likely to provide the types of information contemplated here if they receive some benefit from sharing it.  The European Union, on the other hand, has adopted a stringent, uniform approach to data protection that is wide-ranging and extends across all industries.  It follows that EU citizens may be more sensitive than citizens of other geographies to sharing such information.  Automobile manufacturers will likely need to take these variations into account when implementing telematics systems.  Regardless of geography, drivers should not only look to the manner in which data is being used today, but also contemplate tomorrow, as the expansion of use cases is likely a “not if, but when” scenario.  For this reason, the answer to why a person drives more cautiously may be the same as the answer to why his or her grocery bill mysteriously increased last month: “My car made me do it!”

