Effective May 25, 2017, privacy statements that are “materially inconsistent” with how you handle consumer information violate a new amendment of the Oregon Unlawful Trade Practices Act (“UTPA”).[1]  A consumer may file a complaint against you based on the belief that you handle consumer information in a manner inconsistent with the terms of your privacy statements.  The Oregon Attorney General (“AG”) or a local district attorney (“DA”) has discretion to initiate an “investigative demand,” to enforce a voluntary settlement (an “assurance”) that requires you to pay restitution or punitive damages to injured consumers, or to seek an injunction against your behavior.  Granted, you have the right to defend your interests and recoup damages that you incur due to a frivolous or malicious complaint.  However, in responding to an investigative demand, you still have to review your statements and processes to demonstrate to the FTC, AG or DA that your processes and statements are consistent.  The first step in the journey is to conduct an audit proactively.

The amendment reinforces that performing an annual review of privacy statements and information handling processes is a good practice.  A responsible, comprehensive review should include your online privacy policy and the privacy statements in consumer contracts, cloud-hosting agreements, registration requirements and any other documents that describe your information processes and the information you require consumers to provide to you.  You should know how you and your organization process data before an AG or DA compels a public review by way of an investigative demand.

A “material inconsistency” between what you practice and what you preach violates the amendment to the UTPA.  Privacy statements about consumer information explain how, and for what purpose, the information is:

  • Collected,
  • Used,
  • Stored,
  • Disclosed,
  • Discarded, or
  • Deleted.

If you state that you only disclose consumer information to your vendors, you should only disclose the information to vendors.  If you disclose information to third-party marketers without updating your privacy statements to identify the change in your process, you are taking action that is materially inconsistent with your privacy statement.

Your privacy statement should notify consumers of their right to control the information you collect and how you use it. Good privacy statements also explain why you need certain consumer information to meet your contractual obligations to the consumer.  Ideally, consumers will understand precisely what you will and will not do with their information after reading your privacy statements.  If consumers cannot understand your privacy statements, the probability of confusion and a resulting complaint increases.  Clear privacy statements minimize the chance that consumers mistakenly believe a compliant company uses their information inconsistently with its public privacy statements.

Now is the time to review the privacy statements in your consumer contracts, websites and publications.  Conducting a voluntary audit serves several purposes, including:  (1) avoiding a violation, (2) demonstrating your commitment to protect consumer information and (3) signaling your desire to meet the spirit of the amendment.  Additionally, if you are the subject of an investigative demand, you will benefit from already having information from your audit to formulate a defense.  Having current information immediately available to the investigating office may substantiate your commitment to comply voluntarily with the amendment.

Investigative demands may encourage voluntary compliance with the amendment. The stated purpose of an investigative demand is “to receive an assurance of voluntary compliance.” The investigative demand may require the subject, under oath or otherwise, to:

  • appear and testify,
  • answer written interrogatories,
  • produce relevant documentary material or physical evidence for examination,
  • agree or respond to an order restraining/halting the alleged unlawful practice, or
  • deliver an assurance of voluntary compliance.

O.R.S. § 646.618.

An assurance of voluntary compliance sets forth what actions, if any, the subject of the investigation intends to take with respect to the alleged unlawful trade practice.  The prosecuting attorney has discretion to reject a voluntary assurance, obtain a temporary restraining order or institute a collection or enforcement action for unpaid monies or violation of the assurance. O.R.S. § 646.632(3) and (7). The voluntary assurance may include the payment of restitution to injured consumers. O.R.S. § 646.632.  The court may also impose penalties of up to $25,000 per violation in some circumstances and award reasonable attorney fees and costs at trial and on appeal.  O.R.S. § 646.642.  A “violation” includes each separate occasion on which you mishandle consumer information for each consumer, so there may be multiple violations associated with one consumer.
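
To make the multiplication concrete, here is a minimal sketch of the exposure math in Python, using only the figures above; the function name and the example counts are illustrative assumptions, not anything taken from the statute.

```python
PENALTY_CAP = 25_000  # per violation, in some circumstances (O.R.S. § 646.642)

def max_utpa_exposure(consumers: int, occasions_per_consumer: int) -> int:
    """Each occasion of mishandling, for each consumer, is a separate violation."""
    violations = consumers * occasions_per_consumer
    return violations * PENALTY_CAP

# e.g., mishandling 10 consumers' information on 3 occasions each:
print(max_utpa_exposure(10, 3))  # 750000, before attorney fees and costs
```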

Investing in the review of privacy statements alongside your consumer information processes may be cheaper than responding to an investigation.  There are many “hidden” expenses associated with responding to an investigative demand:

  • Potentially harming your reputation with consumers and your business associates.
  • Distracting employees and owners from their jobs while they respond to the investigation.
  • Incurring unbudgeted legal fees to defend yourself and/or to prosecute your claim arising from a frivolous complaint.

Best practices:  Avoid the stress, strain and worry of an investigation by taking a few hours to audit your policies, contracts and processes at least once a year.

  • If you regularly conduct business in a state that enacts amendments and new legislation on a specific date, schedule your audit before the effective date to avoid a last minute scramble to comply.
  • Monitor the FTC’s decisions, rules and enforcement actions.  You can register through the FTC and most state attorney general websites to receive information about consumer privacy enforcement, legal updates and tips to help businesses comply with consumer protection rules: https://www.ftc.gov/stay-connected.
  • If you conduct business primarily online, see where your customers are located and familiarize yourself with the laws in those jurisdictions.[2]
  • You may want to test whether someone with a demographic profile similar to your typical customer understands your privacy statements.
  • Review the Consumer Complaint Forms in the jurisdictions where you conduct business.  They generally identify specific classes of people the enforcement agency has a particular interest in protecting.  For example, the Oregon Department of Justice considers whether the complainants are veterans, over the age of 65 or speak English as a second language:  http://www.doj.state.or.us/consumer/pdf/consumer_complaint.pdf.
  • Ask someone unfamiliar with your business to read your privacy statements to test them for clarity.

Conducting a voluntary audit may be money well spent. If you do not have access to a money tree, run the numbers and make the business decision that works for you.

[1] The amendment is similar in application and remedy to the “deceptive” and “unfair practices” consumer protection regulations the Federal Trade Commission (“FTC”) enforces.  https://www.ftc.gov/news-events/media-resources/protecting-consumer-privacy/enforcing-privacy-promises.

[2] California, for example, requires entities that collect personal information from California residents to include statements regarding online privacy in privacy policies.  https://oag.ca.gov/privacy.

An earlier version of this blog indicated that consumers have a private right of action under the amendment.  At this time, enforcement actions are expressly available only to government actors.  Presumably, the legislature will determine whether there is an implied right of action.

Washington State Begins Beta Testing Its Pioneering Privacy Modeling App

Seattle — and by extension, Washington — has a purist attitude in its fusion of technology and life. All other states offer their employees manuals, a help line and, if well-funded, a legal department to answer questions about privacy. If employees are lucky enough to have access to legal advice, they may be able to submit a question, receive a response and finally execute on the advice within a few business days. That’s sad.

The Washington State Office of Privacy and Data Protection is leaving those other states eating Mt. Rainier dust. Through collaboration with policy makers, attorneys and tech gurus, Alex Alben, the Chief Privacy Officer in the state’s Office of the Chief Information Officer, has realized his stated goals of examining privacy policies across state agencies and strengthening protections for personal data. The newly launched Privacy Modeling application (PM App), an initiative of the Office of Privacy and Data Protection, provides links to relevant federal and state law regulating security of data collected from individuals. Users immediately receive answers regarding what must, may or never ever can be done with different types of personal data.

The PM App does not just spit out a canned response to inform the user that social security numbers should be protected. Instead, the PM App offers truly helpful analysis of relevant statutes for people who are on the frontline of protecting citizen data being processed by the government. For example, a clerk working in the licensing department may want to know whether the agency can sell information that identifies recently licensed, male aestheticians within a certain age range, who are veterans, who have worked for at least three employers, and who have received disability benefits. By running a query, the user will learn that data such as first and last names, employment addresses, gender and age can probably be sold, but data related to veteran status and disability status must be shielded from disclosure.

MIND BLOWN.

How the PM App Works

Users select the data points to be analyzed:

  • The sector served by the agency, such as banking, health and medical, or education;
  • Types of data that the agency handles to fulfill its mission, such as veteran records, audio recordings, and driver or professional license numbers; and
  • Whether the agency plans to sell or market the data, share it with third parties or use it to grant social benefits.

The PM App analyzes the selections and produces a Results Matrix that uses color-coding to identify statutes that apply to the data and whether the proposed use of each type of data is “No Specific Privacy Law Found,” “Allowed With Limitations” or “Forbidden,” with links to the texts of relevant laws. Users also receive information about the laws and policies excluded from the PM App analysis, the statutes that should be reviewed by an attorney and a reminder that data must be collected in a lawful manner.
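
For readers who think in code, the Results Matrix behaves like a rules-table lookup. The sketch below is a guess at the shape of that logic, not the PM App's actual implementation: the rule entries, field names and statute URLs are hypothetical placeholders.

```python
from dataclasses import dataclass
from enum import Enum

# The three outcome categories the PM App's Results Matrix reports.
class Outcome(Enum):
    NO_LAW_FOUND = "No Specific Privacy Law Found"
    ALLOWED_WITH_LIMITATIONS = "Allowed With Limitations"
    FORBIDDEN = "Forbidden"

@dataclass(frozen=True)
class Rule:
    data_type: str       # e.g., "veteran_status"
    proposed_use: str    # e.g., "sell"
    outcome: Outcome
    citation: str        # link to the text of the relevant law

# Hypothetical entries; the real PM App draws on a curated body of
# federal and state statutes.
RULES = [
    Rule("name", "sell", Outcome.ALLOWED_WITH_LIMITATIONS, "https://example.gov/statute-a"),
    Rule("gender", "sell", Outcome.ALLOWED_WITH_LIMITATIONS, "https://example.gov/statute-a"),
    Rule("veteran_status", "sell", Outcome.FORBIDDEN, "https://example.gov/statute-b"),
    Rule("disability_status", "sell", Outcome.FORBIDDEN, "https://example.gov/statute-c"),
]

def results_matrix(data_types, proposed_use):
    """Map each selected data type to (Outcome, citation-or-None)."""
    matrix = {}
    for dt in data_types:
        match = next((r for r in RULES
                      if r.data_type == dt and r.proposed_use == proposed_use), None)
        matrix[dt] = (match.outcome, match.citation) if match else (Outcome.NO_LAW_FOUND, None)
    return matrix

# The licensing-department query from the example above:
for dt, (outcome, cite) in results_matrix(
        ["name", "gender", "veteran_status", "disability_status"], "sell").items():
    print(f"{dt}: {outcome.value}" + (f" ({cite})" if cite else ""))
```

Run against the licensing-department example, the hypothetical table flags veteran and disability status as “Forbidden” while names and gender come back “Allowed With Limitations.”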

The PM App is a giant leap forward for the owners of the data, the agencies and their employees.

Why the PM App Is Important and Revolutionary 

Government agencies that process data have bad reputations for being invasive and self-interested to the detriment of the individual. It’s no accident that the DMV is the portal to hell in a television and comic book series. Results of a search for “big brother” include words like “scary,” “creepy” and “overbearing.”[1]  Government has been characterized as stripping away the humanity of its subjects[2] or mining them for data. The PM App is a rehab of sorts — it embodies the self-regulatory spirit adopted by many industry sectors in the U.S.

The PM App also signifies the state’s interest in protecting the data entrusted to it.  Let’s be clear: people generally have no option to withhold their personal information if they want to benefit from government services. The privilege of conducting business in some industries requires the provision of fingerprints, background checks and photographs.  Information about our professional licenses is regularly marketed.

Initiatives like the PM App reduce cynicism about having to turn over personal data in order to pursue a dream of becoming a pyrotechnic operator. Development of the PM App signifies the state is invested in protecting our data. If George Orwell is accurate, and I am no more than a collection of my data, I welcome this added layer of protection.

The PM App also has significance for the users. Subject to the express and implied limitations in the User Guide, the PM App immediately provides links to federal and state statutes, and an analysis of whether the proposed use of the data complies with those statutes, enabling users to gain significant familiarity with the statutes that would ordinarily be accessible only through legal counsel. Users who desire to delve deeper into the laws may review definitions, exemptions and penalties to contextualize the laws to their agencies and privacy policies.  Feedback from the testing is anticipated to reveal additional intangible benefits to performance accuracy and morale.

You Too, Can Have an Attorney in a Box

One of the most basic tasks attorneys perform is identifying and analyzing the laws relevant to their clients. The PM App empowers employees who handle personal data by granting them access to laws that directly affect if, how and when data can be handled throughout its life cycle. In accordance with the requirements of the funding source, the PM App will be available through open access. The PM App Guide shows developers how they may tailor the PM App to their own uses. Caution: The PM App Guide is great, but attorney advice is still very necessary when making tough calls about privacy issues.

Click here to access the PM App beta test. You can provide feedback or ask questions via an embedded link on the site. You can also gain additional information about the process of developing the PM App here.

Keep your eyes on the PM App, the Office of the Chief Information Officer, and trends in privacy. A conference focusing on data protection in the evolving digital environment is scheduled for early 2017 in Washington. Keep an eye on everything.

It’s when you look away that you are apt to get hurt.


[1] “Big Brother isn’t watching. He’s singing and dancing. He’s pulling rabbits out of a hat. Big Brother’s busy holding your attention every moment you’re awake. He’s making sure you’re always distracted. He’s making sure you’re fully absorbed.”
Chuck Palahniuk, Lullaby

[2] “Does Big Brother exist?”
“Of course he exists. The Party exists. Big Brother is the embodiment of the Party.”
“Does he exist in the same way as I exist?”
“You do not exist.”
George Orwell (1984)

The Privacy Shield in a nutshell. 

The Privacy Shield permits U.S. businesses to process and control the personal data of individuals, aka data subjects, located in the European Union (EU). Without the Privacy Shield, U.S. businesses risk losing hundreds of millions of dollars if they cannot transfer personal data from the EU — businesses that cannot establish offices in the EU or negotiate agreements with each of the EU member countries will forego commerce with EU companies and data subjects. The U.S. government has agreed to enforce the Privacy Shield against U.S. businesses on behalf of EU data subjects. The U.S. government necessarily has to execute its enforcement duties with diligence. You might say, U.S. government agencies must bite as bad as they bark.

Is certification the best option for your company?

EU privacy standards that protect the data of its citizens are much stricter than those of the U.S. The EU requires U.S. companies to comply with privacy principles that comprise the EU/U.S. Privacy Shield. The U.S. Department of Commerce (Commerce Department) oversees U.S. businesses’ applications and certifications under the Privacy Shield. Your company may decide to be certified under the Privacy Shield if your business is subject to the jurisdiction of the Federal Trade Commission (FTC) or the Department of Transportation (DOT), and EU citizens access your website, do business with you or you conduct business in an EU member country. Each circumstance must be analyzed on a case-by-case basis. Issues such as volume, whether you are a data controller or processor, and whether you have multinational affiliates have bearing on your analysis.

How does the Privacy Shield compare to the Safe Harbor?

The Privacy Shield is more stringent than the Safe Harbor; some privacy principles that were merely guidelines under Safe Harbor are now affirmative covenants under the Privacy Shield. The U.S. government also must meet a higher standard under the Privacy Shield. The EU obligates the FTC and DOT to investigate and enforce penalties against U.S. companies that violate the Privacy Shield Principles.

What is the cost of certification?

While certification under the Privacy Shield is voluntary, U.S. businesses that receive personal data transfers from the EU must meet the same requirements as U.S. businesses that are certified. The fees for certification are based on the business’ annual revenue: the minimum fee is $250 per year for up to $5 million in revenue, and the maximum fee is $2,500 per year for more than $500 million in revenue. U.S. companies that are required to resolve disputes by an EU Data Privacy Authority must pay additional fees.
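
As a back-of-the-envelope illustration, the two fee endpoints quoted above can be encoded as follows. This is a hedged sketch: only the minimum and maximum tiers are stated in this post, the intermediate tiers come from the Commerce Department's published fee schedule, and the function name is my own.

```python
def privacy_shield_annual_fee(annual_revenue_usd: float) -> float:
    """Annual certification fee, encoding only the two tiers stated above."""
    if annual_revenue_usd <= 5_000_000:
        return 250.0           # minimum tier: up to $5M in revenue
    if annual_revenue_usd > 500_000_000:
        return 2_500.0         # maximum tier: more than $500M in revenue
    # Revenue between $5M and $500M falls into intermediate tiers that
    # are not quoted in this post; consult the published fee schedule.
    raise NotImplementedError("See the Commerce Department fee schedule.")

print(privacy_shield_annual_fee(3_000_000))    # 250.0
print(privacy_shield_annual_fee(750_000_000))  # 2500.0
```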

The application process itself is no more complicated than most other business certification processes.  The “real” cost of becoming certified under the Privacy Shield will likely be in personnel resources, especially if the business is not already compliant with the Safe Harbor rules.  For example, the business must dedicate personnel to develop privacy policies, educate employees about the policies, monitor the actions of employees and third party data processors, and take action against parties who violate the policies. There are also costs associated with verifying that third party processors update their security and privacy policies in step with Privacy Shield requirements.  You can review a summary of the five basic steps U.S. businesses must take to apply for certification here. You can review the seven Privacy Shield Principles here.

Alternatives to self-certification under the Privacy Shield.

It may be more cost effective for a business with limited personnel to use a private company to assist with the certification process, establish compliant policies and procedures, and provide ongoing monitoring, auditing, education and advice. The Commerce Department maintains an ever-expanding list of companies that, in compliance with the Privacy Shield[1], the Madrid Resolution, the U.S./Swiss Safe Harbor or the privacy rules adopted by the Asia-Pacific Economic Cooperation, transfer data to U.S. companies from the EU, the European Economic Area, Switzerland and Asia-Pacific member economies.  When evaluating private companies, you should pay close attention to which party to the agreement is liable for violations of the Privacy Shield and the extent to which the contract covers transfers of data to third parties.

Binding Corporate Rules (BCR), model contract clauses and unambiguous consent are also options that you may consider if self-certification is unfeasible for your business.  BCRs are available to multinational companies.  An affiliated company located in the EU may transfer personal data to its U.S. location subject to BCRs.  Model Contracts, drafted by the European Commission, require U.S. businesses to provide adequate levels of protection of the privacy of data subjects.  If you are a data processor, not a data controller, you may have the option of entering into a Direct Data Processing Agreement or adopting the Model Clauses for Processors to eliminate the negotiation of broader issues that apply to controllers, but not processors. If you receive data from a limited number of known EU data subjects, the most cost effective way for you to transfer their data to the U.S. would be to obtain from each of them clear, unambiguous statements that they freely permit the transfer of their personal data.

What are the possible repercussions of not complying with the Privacy Shield?

The FTC can investigate alleged violations of the Privacy Shield, enter consent orders and findings of contempt, and impose administrative penalties. Currently, administrative penalties may be up to $40,000 per violation or per day, for continuing violations. Additional penalties against a business include the FTC’s removal of a company from the Privacy Shield list, resulting in liability under the False Statements Act if the company claims it is certified. Learn from the lessons of others — the FTC has issued record-breaking fines in the past two years, including a $1.3 billion fine issued in the past month. The data owners in the EU, the EU Commission and/or data privacy authority may also have private rights of action against a U.S. company that violates relevant rules.
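
If it helps to see the arithmetic, a rough upper bound on administrative exposure under the figures above might look like this; the function and its parameters are illustrative assumptions, not an FTC formula.

```python
def max_ftc_exposure(discrete_violations: int, continuing_days: int = 0,
                     per_violation_cap: int = 40_000) -> int:
    """Worst case: $40,000 per violation, or per day while a violation continues."""
    return per_violation_cap * (discrete_violations + continuing_days)

# e.g., one violation plus 30 additional days of continuing noncompliance:
print(max_ftc_exposure(1, continuing_days=30))  # 1240000
```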


The wrap-up:

  • Assess how your U.S. based business receives personal data from EU data subjects. Based on the volume, your relationship to the data owners, and whether you process or control the data, you may have to designate an employee or contractor who is knowledgeable about data privacy and cybersecurity to monitor, update and enforce the policy and verify that the privacy notice meets all applicable state, federal and international rules.
  • Consult all aspects of your company organization to assess which option is best for you. Privacy is not a distinct division within your company. Verify that operations, human resources and enforcement of policies work in concert to maintain the standards of the Privacy Shield.

[1] See, the “Privacy Shield List” at https://www.privacyshield.gov/list.

What’s the Case About? In re Nickelodeon Consumer Privacy Litigation[1] is a multi-district consolidated class action filed on behalf of children under the age of thirteen alleging that Viacom used child directed websites it owned to collect, without parental consent, data from the class members which it then provided to co-defendant Google. The data Viacom captured from children included their gender, birthdate, IP address, the webpages visited and the names of videos the children viewed. The court considered an issue of first impression as to whether an IP address is personally identifiable information (PII) under the Video Privacy Protection Act (VPPA) and whether the collection of the data constituted intrusion upon seclusion under New Jersey law. Plaintiffs argued that the vulnerability of children coupled with public aversion to mining them for data supported liability of Viacom.[2]

VPPA allegations dismissed: The court held that Viacom did not violate the VPPA by collecting the IP addresses of children. The decision was based, in part, on the precedent set by In re Hulu Privacy Litigation.[3] The Hulu court determined that static digital identifiers such as IP addresses identify the location of a computer, which, without additional information, cannot be used to identify an individual. Under this rationale, an IP address is not PII, because an address alone cannot “reasonably” lead to the identification of a person. The court also noted that the VPPA is just too old and brittle to encompass technology so distant from its origins as a by-product of Blockbuster, Erol’s Video Club and Hollywood Video stores.  The separate reason Google escaped liability under the VPPA is touched upon below.

New Jersey state law claims remanded: The court remanded the claim against Viacom for violation of the New Jersey intrusion upon seclusion law. The court did not look favorably upon Viacom’s failure to honor its notice to parents that it would not collect any data from children.

The allegations against Viacom: Viacom owns the websites, Nick.Jr. and Nick.com (Nick Sites), both of which are associated with the Nickelodeon channel. The Nick Sites offer games and streaming videos to children and included this notice to parents:

HEY GROWN-UPS: We don’t collect ANY personal information about your kids. Which means we couldn’t share it even if we wanted to![4]

When children registered on one of the Nick Sites, they received a nickname of an avatar based on a Nickelodeon cartoon character of the same gender and approximate age as the child. The plaintiffs alleged that Viacom used first-party cookies it placed on the children’s computers to obtain information about which games and videos the children accessed.  Viacom disclosed the information it collected to Google and permitted Google to place ads on the Nick Sites.

The allegations against Google: The plaintiffs alleged that Google (1) placed third-party cookies via advertisement on the computers of children who accessed the Nick Sites, (2) used those cookies to track the children on any website displaying a Google ad, and (3) used “Doubleclick.net cookies”[5] to track the browsing of whomever used the computer across any website Google owned, such as Gmail, YouTube and Google Maps.

Analysis of the VPPA: Congress enacted the VPPA after the 1987 Senate Judiciary Committee’s hearings regarding Supreme Court nominee, Robert Bork. During the hearings, a newspaper obtained and publicized a list of titles of 146 films Judge Bork or members of his family rented from a local video store.[6] The list of videos was, even by 1987 standards, unremarkable — not a single NC-17 film on the list. Congress agreed, however, that a person’s video viewing history should be private. Consequently, under the VPPA, a user must give permission for his or her video viewing data to be shared. How does this translate to current technology? It doesn’t. The court likened applying the VPPA to internet technology to putting a square peg in a round hole.[7] Additionally, the court referred to the VPPA as a rigid law that lacked the flexibility of the Children’s Online Privacy Protection Act (COPPA) to regulate effectively technology that is “in flux.”[8]

The key definitions under the VPPA are:

Consumer: any renter, purchaser or subscriber of goods or services from a video tape service provider.

Video tape service provider: any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale or delivery of prerecorded video cassette tapes or similar audio visual materials.

Personally identifiable information (PII): includes information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.

18 U.S.C. § 2710(a). A violation of the VPPA occurs when “[a] video tape service provider … knowingly disclose[d], to any person, personally identifiable information concerning any consumer of such provider.” Id.

The VPPA was created to protect information specific to viewers of movies. The court noted that if the definition of PII were expanded in all statutes to include an IP address, there would be no end to litigation, and the distinctions between the groups protected by certain statutes would be eroded. Congress’ decision to omit a new definition of PII in the 2013 amendment of the VPPA further emphasized that the VPPA “serves different purposes, and protects different constituencies, than other, broader privacy laws.”[9] For example, if “Google were to start purposefully leaking its customers’ YouTube video-watching histories,” the VPPA “would almost certainly” be violated.[10]

Extending the VPPA to regulate current technology would likely result in unlimited violations. Defining an IP address as PII within the context of the VPPA would mean that the disclosure of an IP address to any Internet company with registered users might trigger liability, given that an IP address is regularly transmitted to an Internet service provider (ISP) with each search.[11] The court also pointed out that there is a spectrum of PII, with first and last name at one end, and an IP address at the other, lower, end of the spectrum, given that the IP address alone may be insufficient to identify a person. The case cited by the court to illustrate the need for a subpoena to identify a person is a copyright infringement case, Warner Bros. Records Inc. v. Walker, 704 F. Supp. 2d 460 (W.D. Pa. 2010). Warner Bros. needed a subpoena to identify the student who was assigned the IP address used to illegally download some songs. The student, who shared a room with multiple roommates, possibly would not have been identified without a subpoena, given that several people may have used the computer. It was not “reasonably” likely for Warner Bros. to identify the person responsible for the downloads without a subpoena. Understandably, a subpoena may be necessary in a fluid environment such as a college where multiple people may have access to a computer.

Time-out: It’s one thing for Warner Bros. to need help from the college to identify which of multiple people may have used an IP address assigned by the college. It’s something altogether different when Google, which the court describes as “a company whose entire business model is purportedly driven by the aggregation of information about Internet users,” wants to identify a person. The plaintiffs’ amicus very astutely provided some real-world perspective about what happens when Google wants to find out who you are: “concluding ‘that Google is unable to identify a user based on a combination of IP address … and other browser cookie data … would be like concluding the company that produces the phone book is unable to deduce the identity of an individual based on their telephone number.’”[12] Enough said. Resume play.

The court affirmed the dismissal of the intrusion upon seclusion claim against Google: Although the court acknowledged that many people, and some courts, find the monetization and collection of data from children without parental consent repugnant, those acts alone did not establish a claim for intrusion upon seclusion. Under New Jersey law, an intrusion upon seclusion claim occurs upon a showing of (i) an intentional intrusion (ii) upon the seclusion of another that is (iii) highly offensive to a reasonable person.[13] The court disregarded the fact that children, instead of adults, were tracked, because third-party cookies serve a legitimate commercial purpose for advertisers and Google uses them on the Nick Sites the same way it uses them on other non-child directed sites.

This is why Viacom may be liable for intrusion upon seclusion: When Viacom notified parents that it did not collect any personal information about children, it was reasonable for a jury to decide that parents may have permitted their children unsupervised access to the Nick Sites based on the disclaimer. If the parents of the plaintiff class members didn’t already have an expectation of privacy, Viacom’s notice created an expectation of privacy. Viacom’s violation of that trust by surreptitiously collecting data from children could be considered highly offensive under the applicable law.

Summary

An IP address has been likened to a computer’s fingerprint. If a statute identifies an IP address or other static assigned number as PII, that number is a great starting point to identify a user. For example, under COPPA and HIPAA, an IP address is as high on the spectrum of PII as a user’s first and last name. The rationale behind the ranking of an IP address in these statutes is that sometimes it is reasonable that an IP address can lead you to the user. Who’s looking for you also matters. It is reasonable to expect that Google, using third-party cookies, can use your IP address to identify you.

Sometimes an IP address can only identify a computer, i.e., it cannot “reasonably” be used to identify you. Without a subpoena or some alternate means of creating a mosaicked identity, you may have to resort to battling “John Doe” until a subpoena grants you the right to retrieve additional information about the IP address. In these instances, IP addresses are not considered to be PII. At the end of the day, you have found a computer. Good job.


What did we learn?

  • Don’t oversell your privacy policy. Viacom faces potential liability because it violated its own privacy notice to parents.
  • Do the right thing. Don’t get information from children under the age of 13 that is defined as PII under any privacy law without parental consent. These days there are few things about which 90 percent of Americans agree — Viacom’s actions on the Nick Sites are considered to be highly offensive.
  • Now that you know that sometimes children’s browsing history, IP address and other information can be collected through third-party cookies without parental consent, educate your children. The Federal Trade Commission provides guidance on helping children distinguish between ads and entertainment: https://www.commonsensemedia.org/website-reviews/admongo.
  • Understand that a trade-off for having the world at your fingertips may mean sharing your computer’s fingerprint with inquiring minds.

Stay safe.


[1] In re Nickelodeon Consumer Privacy Litigation, No. 15-1441, 2016 WL 3513782 (3d Cir. June 28, 2016).

[2] Id. at * 4 (alleging that (1) targeting ads to children is more profitable than targeting ads to adults, in part, “because children are generally unable to distinguish between content and advertisements;” (2) 80% and 90% of 2,000 adult respondents, respectively, oppose an advertiser’s tracking of children and believed advertisers should obtain a parent’s permission before installing cookies on a device used by a minor child; and (3) companies can use “browser fingerprinting” to identify specific users).

[3] In re Hulu Privacy Litigation, No. 11-CV-3764 (LB), 2014 WL 1724344 (N.D. Cal. Apr. 28, 2014).

[4] In re Nickelodeon, No. 15-1441, 2016 WL 3513782, at *3 (3d Cir. June 28, 2016).

[5] Id.

[6] Michael Dolan, “The Bork Tapes,” City Paper, Sept. 25–Oct. 1, 1987, at 1.

[7] In re Nickelodeon, No. 15-1441, 2016 WL 3513782, at *15 (3d Cir. June 28, 2016).

[8] Id. at *16-19.

[9] In re Nickelodeon, No. 15-1441, 2016 WL 3513782, at *19 (3d Cir. June 28, 2016).

[10] Id. at *17.

[11] Id. at *20-21.

[12] Id. at *20.

[13] Hennessy v. Coastal Eagle Point Oil Co., 609 A.2d 11, 17 (N.J. 1992) (citing Restatement (Second) of Torts § 652B (1977)).

“A reputation once broken may possibly be repaired, but the world will always keep their eyes on the spot where the crack was.” ― Joseph Hall

Consumers may be injured by inaccurate data that they cannot review or correct. There’s a hole in the bucket, dear Congress.[1] 

The children’s song, “There’s a Hole in the Bucket,” exemplifies the conundrum many consumers experience when they are denied opportunities or inappropriately solicited. Data brokers maintain files with over 1,500 pieces of personal data on each of us. There are over 3,500 data brokers in the U.S. Only about one-third of them permit individuals to “opt-out” of inclusion in their data banks, usually for a fee. Unless and until you recognize an unexplained pattern of lost job opportunities or rejected apartment applications, or are targeted by unsolicited marketing, you may not care what data brokers maintain in your files.

Imagine this: You are 22 years old and gung-ho to use your brand-spanking new business organization degree as an entry-level traveling corporate trainer. You grant recruiters the right to conduct background checks after they indicate their interest in you based on your resume. You get rejection after rejection. You finally muster the courage to call a recruiter and ask why, and she explains that you are not a good fit based on background information that describes you as 39 years old, the parent of four young children, having a Ph.D. and a sufferer of agoraphobia. None of this information is true.

Your prospective employers may have relied on information provided by data brokers or consumer reporting agencies (CRAs) in determining that you are not a viable candidate. Now that you know inaccurate information is being reported about you, you are confident that you can correct your files and the employers will reverse their decisions. You can if the inaccurate information is from a CRA. But if data brokers provided the incorrect information, you will find yourself in the miserable position of knowing your files are wrong and being powerless to correct them. You know prospective employers have considered inaccurate information about you, but you don’t know which employers relied on which data brokers or which inaccuracies in your files made you undesirable for hire. You don’t know how many data brokers have files on you or what evidence you can provide to disprove the inaccurate information about you. You and “Dear Henry” share the predicament of wanting to fix the hole in the bucket but lacking the tools to do so.

Let the screening begin.

Many decisions about consumers, job applicants and second dates are based on inaccurate information provided by data brokers. Data brokers sell consumers’ personally identifiable information (PII) to be used in marketing, people searches, fraud detection and risk management. The FTC defines data brokers as “companies that collect information, including personal information about consumers, from a wide variety of sources for the purpose of reselling such information to their customers for various purposes, including verifying an individual’s identity, differentiating records, marketing products, and preventing financial fraud.”[2]  The Fair Credit Reporting Act (FCRA) applies to consumer reporting agencies (CRAs) like Experian, TransUnion and Equifax, not data brokers. CRAs must take reasonable steps to ensure the accuracy of the consumer PII they distribute and they must provide consumers the opportunity to review and correct their records. No federal statute or regulation imposes comparable duties on data brokers. If enacted as introduced in 2009, the Data Accountability and Trust Act (DATA) would provide procedures for individuals to audit and verify the accuracy of data held by data brokers. The swathe of data collected by data brokers is astounding and troubling. Add the fact that data brokers are generally less expensive to use than CRAs, and consumers are at a distinct disadvantage relative to data brokers.

Here’s what’s in the bucket.

Reports about consumers are based on information showing what they own or use; who they are with, have lost or are fighting; how they make, save and spend their money; and what interests or ails them, including mental, genetic and “other” diseases that may be embarrassing.[3] For example, when you register your car, record a deed of trust, activate a warranty, join a Facebook group, fill a new prescription, or get sued, married, divorced or widowed, data brokers collect that information. It is tacitly understood that PII from data brokers is not accurate and enables discrimination in hiring, the provision of resources and opportunities.[4] Consumer advocacy groups report that information used in people search sites is not vetted — the consumer has the responsibility of figuring out which of 67 people named “Pamela Samuelson” authored Protecting Privacy Through Copyright Law?. Marketing information is more accurate, but is still unreliable. For example, a data broker may correctly report that a household member purchased a new car, but err by addressing car wash coupons to the resident third-grader. Risk mitigation information is the most accurate, because it is expected to at least correspond to the correct person, even if the results are outdated.

This brings to mind a character on a show who changed his name because he shared it with a well-known artist who was convicted of sex crimes. His new name, unfortunately, was shared with a well-known artist who was convicted of murder. How do you feel knowing that you may be judged by the bad report of someone who has a name similar to yours? The identities of entities using bad data may influence your answer.

Who’s looking in the bucket?

Financial institutions, government agencies, political organizers and insurance companies use the services of data brokers. As of May, the customers of one of the largest data brokers included “47 Fortune 100 clients; 12 of the top 15 credit card issuers; seven of the top 10 retail banks; eight of the top 10 telecom/media companies; seven of the top 10 retailers; nine of the top 10 property and casualty insurers; three of the top five domestic airlines; and six of the top 10 U.S. hotels.”[5] How likely are you to recognize that, after your namesake niece filed for bankruptcy, the hotel prices you were offered increased by 18%?

Can you look in the bucket?

No. If data brokers filled the bucket, no federal law gives an individual the right to look in the bucket. A subpoena or other discovery procedure may be your best option to see your file. If a CRA filled the bucket, yes, an individual has the right to review and correct the information in the bucket.

What can you do?

  • Educate yourself about your rights. See whether your state has any laws that offer you protection. California, for example, shields victims of violent crimes from having their PII publicized on the internet.
  • Opt out of as many of the data broker sites as is reasonable. Visit this website to get started: http://www.computerworld.com/article/2849263/doxxing-defense-remove-your-personal-info-from-data-brokers.html.
  • Lobby your federal and state legislators and align yourself with organizations that advocate for the right to control your PII.

Stay safe.


[1] There’s a Hole in the Bucket, Songwriters: Harry Belafonte, Odetta Felious Gordon © Next Decade Entertainment, Inc.

[2] FTC, Protecting Consumer Privacy in an Era of Rapid Change, at 68 (Mar. 2012).

[3] Steve Kroft, The Data Brokers: Selling Your Personal Information, 60 Minutes (CBS Mar. 9, 2014), http://www.cbsnews.com/news/the-data-brokers-selling-your-personal-information/.

[4] Exec. Office of the Pres., Big Data: Seizing Opportunities, Preserving Values, pp. 51-53, May 2014, http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf.

[5] U.S. Senate Commerce Committee, A Review of the Data Broker Industry: Collection, Use, and Sale of Consumer Data for Marketing Purposes  (December 2013).

Andy Johnson-Laird, President of Johnson-Laird, Inc., was kind enough to offer advice on three security techniques he recommends to detect and deny hackers, such as those we discussed in last week’s blog post.

  1. Intrusion Detection Systems identify external probing or port scanning of known IP addresses. He describes it as “the equivalent of lying in bed at night and listening for someone rattling your front door handle. It’s a more or less constant rattling when it comes to port scanning.” See https://en.wikipedia.org/wiki/Intrusion_detection_system.
  2. Network Address Translation (NAT-ing) prevents the routing of internal IP addresses. Internal IP addresses will be selected from the range of “non-routable” addresses reserved by the Internet Engineering Task Force and the Internet Assigned Numbers Authority (a minimal range check appears after this list). See https://en.wikipedia.org/wiki/Reserved_IP_addresses.
  3. Internet Protocol Security (IPSec) is an end-to-end security scheme that requires mutual authentication at the beginning of communication and the negotiation of cryptographic keys during the session. See https://en.wikipedia.org/wiki/IPsec.
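
To illustrate item 2, here is a short Python sketch that checks whether an address sits inside the reserved, non-routable ranges; the ranges listed are the standard IETF private-use blocks (RFC 1918), and the function name is my own.

```python
import ipaddress

# The IETF/IANA private-use ("non-routable") blocks from RFC 1918.
PRIVATE_RANGES = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_non_routable(addr: str) -> bool:
    """True if the address falls inside a reserved, non-routable range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in PRIVATE_RANGES)
    # The standard library shortcut covers a broader reserved set:
    # ipaddress.ip_address(addr).is_private

print(is_non_routable("192.168.1.24"))  # True: stays behind the NAT
print(is_non_routable("8.8.8.8"))       # False: publicly routable
```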

News Flash, May 12, 2016:

The Department of Justice proposed that the biometric database the FBI has been amassing for eight years should be exempt from privacy laws.[1] If approved, the proposal would free the FBI to save images of faces, handprints, tattoos, iris scans and biographies of people who don’t know of the existence of, let alone the content of, their files. The largely unvetted findings of investigations other federal agencies conducted on job applicants will also be included in the database. In a nutshell, anyone could be detrimentally affected based on inaccurate information and never know it. Stay tuned for updates on this developing proposal.


[1] See Tim Cushing, FBI Doesn’t Want Privacy Laws To Apply To Its Biometric Database, from the and-doesn’t-want-to-let-citizens-know-how-THEIR-privacy-is-affected dept, TechDirt, May 12, 2016, https://www.techdirt.com/articles/20160508/13574834381/fbi-doesnt-want-privacy-laws-to-apply-to-biometric-database.shtml.

Sometimes law enforcement needs a warrant to access cellphone data, sometimes a court order. Sometimes nothing is required.


Roaming while you roam.

Depending on where you use your cellphone, law enforcement may obtain your location records from your wireless provider without a court order or warrant — neither is required in Washington state. In urban areas where there are multiple cell towers, a phone’s location can be identified to within half a mile. Turning off your location services or powering down the cellphone alone will not shield you from law enforcement.


Are you where you said you’d be?

When processing data, cellphones communicate with the strongest available cell tower signal. Calls, texts and internet browsing generate cell-site location information (CSLI). CSLI is time-stamped and linked to the phone number. Cell towers emit different signals in each direction, so the phone’s movement is tracked by its angular position relative to a tower. From CSLI, law enforcement can track where you are; what cell phones are near you; with what phone numbers you communicate; how long you communicate; and what routes you travel. In essence, everything except the actual communication is recorded. If you aren’t a criminal, you may not care. Depending on how CSLI is produced, even the innocent can catch the attention of law enforcement.
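
For the technically inclined, a CSLI record can be pictured as a small structured row. The sketch below is illustrative only; the field names are assumptions rather than any carrier's actual schema, and the filter anticipates the “tower dump” practice described below.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class CSLIRecord:
    phone_number: str     # the number the record is linked to
    timestamp: datetime   # every record is time-stamped
    tower_id: str         # the tower that carried the traffic
    sector: int           # directional antenna face, giving angular position
    event: str            # "call", "text" or "data"

def tower_dump(records, tower_id, start, end):
    """Return every record, for every phone, that used the targeted
    tower during the window -- this is how bystanders get swept in."""
    return [r for r in records
            if r.tower_id == tower_id and start <= r.timestamp <= end]
```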


Despite your innocence, law enforcement may receive your cellphone usage records.

“Tower dumps” entail producing CSLI for all cellphones that processed data sent through a tower. If your cellphone uses a targeted tower, your data may be captured. You go from having fun with your friends to law enforcement tracking you by your number.


Producing CSLI in real-time transfers your records contemporaneously as they are created by cellphone usage.  If this happens, you are probably having a bad day — this method may be used in exigent circumstances.  Collection of historical CSLI shows all CSLI for a cellphone for a specific period of time.


One person’s record is another person’s dirty laundry.

In U.S. v. Timothy Carpenter[1], the Sixth U.S. Circuit Court of Appeals joined the Eleventh Circuit[2] in ruling that law enforcement agencies do not need a warrant to track a caller’s location through cell tower records. Timothy Carpenter and Timothy Sanders robbed nine cellphone stores (ironic, isn’t it?) in Michigan and Ohio within four months.

The FBI requested the “transactional records” of the Timothys’ wireless providers, aka, the Timothys’ historical CSLI. Court orders were issued pursuant to the Stored Communications Act (SCA) after the FBI showed there were reasonable grounds to believe the CSLI was relevant to the investigation. The FBI reviewed 127 days of CSLI for one Timothy and 88 days for the other. The government established through the historical CSLI that the Timothys were located within a half-mile to two miles of each armed robbery when it occurred. On appeal, the defendants argued that the Fourth Amendment required the government to show probable cause and use a search warrant to access the CSLI.


The opinion focused on the following:

  1. There was no search, because the FBI collected the wireless providers’ data routing information, which was gathered in the ordinary course of business.
  2. CSLI does not refer to the content of the defendants’ private communications.
  3. Every cellphone user who has paid roaming fees knows that wireless carriers collect locational information, so there was no expectation of privacy.
  4. CSLI is so imprecise compared to GPS that there is no expectation that the cellphone user can be located exactly.
  5. The SCA requires the government to meet the “reasonable grounds” standard to obtain a court order for CSLI, not the “probable cause” standard required for a warrant.


Tracking or stalking? Duration matters.

The concurring opinion in Carpenter questioned whether the business records standard of proof applies in the review of an alleged violation of Fourth Amendment rights. The rationale behind the question is that a business’s production of credit card records showing purchases, for example, may be sufficiently distinct from the production of cellphone records showing personal location to require a more stringent analysis. The concurring judge also found the scope of the location monitoring troubling. Lawfully tailing a suspect is one thing. Lawfully tailing a suspect for three to four months transmutes the surveillance into the realm of privacy invasion.


So, how do I keep law enforcement out of my data?

iOS and Android operating systems and apps offer some protection of CSLI location data. Start by turning off the location services for all your existing apps. Download apps that discard the location data cached on your cellphone. Get and stay off the grid by using localized Wi-Fi connections. Rely on an offline map or app that anonymizes the cellphone, encrypts the location data and permanently deletes your data within a certain amount of time. Regularly monitor new technology used by law enforcement and cybersecurity experts.


After all, when you build a better mousetrap, law enforcement will build a better mouse. Justice William O. Douglas called upon his Pacific Northwest ideals when he wrote, “The right to be let alone is indeed the beginning of all freedom.” Cheers to freedom.



[1] www.ca6.uscourts.gov/opinions.pdf/16a0089p-06.pdf
[2] https://www.eff.org/document/us-v-q-davis-opinion