
Beyond IP Law

A look at what's going on in the world of intellectual property.

Jaguar Land Rover Ltd Puts the Brakes on the Trivett Family Trust

Posted in Domain Name, Intellectual Property

In a recent domain name decision, the World Intellectual Property Organization (WIPO) ordered the cancellation of 175 domain names that include the famous Range Rover, Land Rover and Jaguar trademarks. The domains were registered in the .au ccTLD by the Trustee for the Trivett Family Trust (Trivett). Representative samples include rangeroverservicecentre.com.au, jaguarhybrid.com.au and landrover.net.au. Jaguar Land Rover (JLR) submitted its complaint in August; the decision was rendered on October 10, 2016. The ownership of the Jaguar, Land Rover and Range Rover marks was not in dispute. Trivett submitted that it acquired the domains to develop a proposed "Maintain My" web platform that would connect consumers to a range of service providers, including manufacturers of both genuine and non-genuine automotive spare parts.

JLR submitted that the use of well-known trademarks together with geographic or descriptive terms creates a domain name that is confusingly similar to the well-known trademark. Furthermore, JLR did not license or permit Trivett to use the trademarks, nor was the proposed use of the trademarks tantamount to a bona fide offering of goods and services. Trivett relied upon the test set out in Oki Data Americas, Inc. v. ASD, Inc., which sets out four minimum factors used to decide whether there was a bona fide use of the domain name. Those factors are:

  1. The respondent must actually be offering the goods or services (Trivett had said that it was going to use the names starting in 2017 as part of its “Maintain My” platform);
  2. Only genuine trademarked goods could be sold on the website;
  3. The site must accurately disclose the relationship between the registrant and the trademark owner, and may not falsely suggest that the registrant is the trademark owner or is an official site; and
  4. The respondent must not try to corner the market in all domain names (175!), depriving the trademark owner of its own use of the mark.

The WIPO panel noted that the decision might have been different had Trivett been able to show development of its "Maintain My" mark, evidence that it actually sold JLR vehicles, and a registration footprint smaller than 175 domains.

The result: Oki Data remains a reasonable test for bona fide intent. It just so happened that Trivett did not satisfy any of the prongs of the test.

Answering the Question: ‘Can third parties challenge a copyright Work Made for Hire ownership assertion?’

Posted in Copyright, Intellectual Property

A recent case out of the Second Circuit says “Yes. Third parties have standing to assert the defense that a copyright owner’s claim of ownership is erroneously based on Work Made for Hire (WMFH).”

Urbont v. Sony Music held that the defendant had standing to challenge the validity of a WMFH ownership claim. Sony claimed the music was owned by Marvel by virtue of WMFH; Jack Urbont, the plaintiff composer, challenged that WMFH characterization. This is noteworthy because it was Marvel, not Sony, that would be the copyright owner under a theory of WMFH. The music at issue was the theme song to the 1960s "Iron Man" television series. Marvel was unrelated to Sony at the time the work was created.

Urbont countered that Marvel had no employees who were composers, and that music alone does not qualify for WMFH status under the nine statutory categories of work.

The Second Circuit ruled that whether the work was indeed a WMFH was a genuine issue, that defendants were entitled to raise the defense, and that the defense rested on disputed facts that precluded summary judgment.

The opinion came down on July 29 of this year. Discussing it before now seemed premature because it wasn't clear whether Sony would appeal. This week, Urbont is reported to be the prevailing party: Urbont v. Sony settled on confidential terms, and there will be no further appeal.

Pro-tip takeaway: An assignment would have solved the problem. The nine categories of WMFH that apply to non-employees are increasingly being read narrowly. "Audiovisual works" are covered by one of the nine categories, but audio alone (namely, music) is not. To refresh your recollection, the nine categories listed in Section 101 of Title 17 of the copyright statute are:

  1. A contribution to a collective work (like a piece for a magazine, anthology or encyclopedia);
  2. A part of a motion picture or other audiovisual work;
  3. A translation;
  4. A supplementary work (like a foreword, afterword, bibliography, appendix, index or editorial notes);
  5. A compilation (like an anthology, database or anything that qualifies as a “collective work” from category 1 above);
  6. An instructional text (generally, any text that could go in a textbook);
  7. A test;
  8. Answer material for a test; and
  9. An atlas.

And even if your work falls into one of these nine categories, the statute also requires a written agreement stating that it's a "Work Made For Hire." To be safe, the agreement should be signed by both parties before the work is created.

Urbont v. Sony Music Entertainment Inc., No. 15-1778 (2d Cir. July 29, 2016)

Small Companies and Those Not Certified Under the Safe Harbor Face Hidden Costs in the EU/US Privacy Shield Certification Process

Posted in Data Security, Privacy

The Privacy Shield in a nutshell. 

The Privacy Shield permits U.S. businesses to process and control the personal data of individuals, aka data subjects, located in the European Union (EU). Without the Privacy Shield, U.S. businesses risk losing hundreds of millions of dollars if they cannot transfer personal data from the EU; businesses that cannot establish offices in the EU or negotiate agreements with each of the EU member countries will forgo commerce with EU companies and data subjects. The U.S. government has agreed to enforce the Privacy Shield against U.S. businesses on behalf of EU data subjects, and it necessarily has to execute its enforcement duties with diligence. You might say U.S. government agencies must bite as hard as they bark.

Is certification the best option for your company?

EU privacy standards that protect the data of its citizens are much stricter than those of the U.S. The EU requires U.S. companies to comply with the privacy principles that comprise the EU/U.S. Privacy Shield. The U.S. Department of Commerce (Commerce Department) oversees U.S. businesses' applications and certifications under the Privacy Shield. Your company may decide to be certified under the Privacy Shield if your business is subject to the jurisdiction of the Federal Trade Commission (FTC) or the Department of Transportation (DOT), and EU citizens access your website, do business with you, or you conduct business in an EU member country. Each circumstance must be analyzed on a case-by-case basis. Issues such as volume, whether you are a data controller or a data processor, and whether you have multinational affiliates have a bearing on your analysis.

How does the Privacy Shield compare to the Safe Harbor?

The Privacy Shield is more stringent than the Safe Harbor; some privacy principles that were merely guidelines under Safe Harbor are now affirmative covenants under the Privacy Shield. The U.S. government also must meet a higher standard under the Privacy Shield. The EU obligates the FTC and DOT to investigate and enforce penalties against U.S. companies that violate the Privacy Shield Principles.

What is the cost of certification?

While certification under the Privacy Shield is voluntary, U.S. businesses that receive personal data transfers from the EU must meet the same requirements as U.S. businesses that are certified. The fees for certification are based on the business’ annual revenue: the minimum fee is $250 per year for up to $5 million in revenue, and the maximum fee is $2,500 per year for more than $500 million in revenue. U.S. companies that are required to resolve disputes by an EU Data Privacy Authority must pay additional fees.

The application process itself is no more complicated than most other business certification processes. The "real" cost of becoming certified under the Privacy Shield will likely be in personnel resources, especially if the business is not already compliant with the Safe Harbor rules. For example, the business must dedicate personnel to develop privacy policies, educate employees about the policies, monitor the actions of employees and third-party data processors, and take action against parties who violate the policies. There are also costs associated with verifying that third-party processors update their security and privacy policies in step with Privacy Shield requirements. You can review a summary of the five basic steps U.S. businesses must take to apply for certification here. You can review the seven Privacy Shield Principles here.

Alternatives to self-certification under the Privacy Shield.

It may be more cost-effective for a business with limited personnel to use a private company to assist with the certification process, establish compliant policies and procedures, and provide ongoing monitoring, auditing, education and advice. The Commerce Department maintains an ever-expanding list of companies that transfer data to U.S. companies from the EU, the European Economic Area, Switzerland and Asia-Pacific Economic Cooperation countries in compliance with the Privacy Shield,[1] the Madrid Resolution, the U.S./Swiss Safe Harbor or the Asia-Pacific Economic Cooperation privacy rules. When evaluating private companies, you should pay close attention to which party to the agreement is liable for violations of the Privacy Shield and the extent to which the contract covers transfers of data to third parties.

Binding Corporate Rules (BCRs), model contract clauses and unambiguous consent are also options you may consider if self-certification is unfeasible for your business. BCRs are available to multinational companies: an affiliated company located in the EU may transfer personal data to its U.S. location subject to BCRs. Model Contracts, drafted by the European Commission, require U.S. businesses to provide adequate levels of protection for the privacy of data subjects. If you are a data processor, not a data controller, you may have the option of entering into a Direct Data Processing Agreement or adopting the Model Clauses for Processors to eliminate the negotiation of broader issues that apply to controllers but not processors. If you receive data from a limited number of known EU data subjects, the most cost-effective way to transfer their data to the U.S. is to obtain from each of them a clear, unambiguous statement freely permitting the transfer of their personal data.

What are the possible repercussions of not complying with the Privacy Shield?

The FTC can investigate alleged violations of the Privacy Shield, enter consent orders and findings of contempt, and impose administrative penalties. Currently, administrative penalties may be up to $40,000 per violation or per day, for continuing violations. Additional penalties against a business include the FTC's removal of a company from the Privacy Shield list, resulting in liability under the False Statements Act if the company claims it is certified. Learn from the lessons of others — the FTC has issued record-breaking fines in the past two years, including a $1.3 billion fine issued in the past month. The data owners in the EU, the EU Commission and/or data privacy authority may also have private rights of action against a U.S. company that violates relevant rules.

 

The wrap-up:

  • Assess how your U.S.-based business receives personal data from EU data subjects. Based on the volume, your relationship to the data owners, and whether you process or control the data, you may have to designate an employee or contractor who is knowledgeable about data privacy and cybersecurity to monitor, update and enforce the policy and verify that the privacy notice meets all applicable state, federal and international rules.
  • Consult all parts of your organization to assess which option is best for you. Privacy is not a distinct division within your company. Verify that operations, human resources and policy enforcement work in concert to maintain the standards of the Privacy Shield.

[1] See the "Privacy Shield List" at https://www.privacyshield.gov/list.

In re Nickelodeon Consumer Privacy Litigation: An IP Address Is Not Always Personally Identifiable Information

Posted in Data Security, Privacy

What's the Case About? In re Nickelodeon Consumer Privacy Litigation[1] is a multidistrict consolidated class action filed on behalf of children under the age of thirteen, alleging that Viacom used child-directed websites it owned to collect, without parental consent, data from the class members, which it then provided to co-defendant Google. The data Viacom captured from children included their gender, birthdate, IP address, the webpages visited and the names of videos the children viewed. The court considered issues of first impression: whether an IP address is personally identifiable information (PII) under the Video Privacy Protection Act (VPPA), and whether the collection of the data constituted intrusion upon seclusion under New Jersey law. Plaintiffs argued that the vulnerability of children, coupled with public aversion to mining them for data, supported holding Viacom liable.[2]

VPPA allegations dismissed: The court held that Viacom did not violate the VPPA by collecting the IP addresses of children. The decision was based, in part, on the precedent set by In re Hulu Privacy Litigation.[3] The Hulu court determined that static digital identifiers such as IP addresses identify the location of a computer, which, without additional information, cannot be used to identify an individual. Under this rationale, an IP address is not PII, because an address alone cannot "reasonably" lead to the identification of a person. The court also noted that the VPPA is just too old and brittle to encompass technology so distant from its origins as a by-product of Blockbuster, Erol's Video Club and Hollywood Video stores. A nuance as to why Google escaped liability under the VPPA is touched upon below.

New Jersey state law claims remanded: The court remanded the claim against Viacom for violation of the New Jersey intrusion upon seclusion law. The court did not look favorably upon Viacom’s failure to honor its notice to parents that it would not collect any data from children.

The allegations against Viacom: Viacom owns the websites Nick.Jr. and Nick.com (Nick Sites), both of which are associated with the Nickelodeon channel. The Nick Sites offered games and streaming videos to children and included this notice to parents:

HEY GROWN-UPS: We don’t collect ANY personal information about your kids. Which means we couldn’t share it even if we wanted to![4]

When children registered on one of the Nick Sites, they received an avatar nickname based on a Nickelodeon cartoon character of the same gender and approximate age as the child. The plaintiffs alleged that Viacom used first-party cookies it placed on the children's computers to obtain information about which games and videos the children accessed. Viacom disclosed the information it collected to Google and permitted Google to place ads on the Nick Sites.

The allegations against Google: The plaintiffs alleged that Google (1) placed third-party cookies via advertisement on the computers of children who accessed the Nick Sites, (2) used those cookies to track the children on any website displaying a Google ad, and (3) used “Doubleclick.net cookies”[5] to track the browsing of whomever used the computer across any website Google owned, such as Gmail, YouTube and Google Maps.
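Mechanically, the cross-site tracking alleged here rests on ordinary third-party cookie behavior: once an ad server sets a unique identifier in a browser, every later page that embeds content from that same server replays the identifier. Below is a minimal sketch of that mechanism in Python; the domains and cookie value are illustrative, not from the case record.

```python
# Minimal sketch of third-party cookie tracking.
# Domains and identifiers are illustrative, not from the case record.

browser_cookie_jar = {}  # cookies keyed by the domain that set them

def load_page(page_domain: str, embedded_ad_domain: str) -> None:
    """Simulate a page view that triggers a request to an embedded ad server."""
    # The browser sends any cookie previously set by the ad domain,
    # even though the user only "visited" page_domain.
    sent = browser_cookie_jar.get(embedded_ad_domain)
    if sent is None:
        # First contact: the ad server sets a unique identifier (Set-Cookie).
        browser_cookie_jar[embedded_ad_domain] = "uid-12345"
        print(f"{page_domain}: ad server sets cookie uid-12345")
    else:
        # Later visits to ANY page embedding the same ad domain replay the id,
        # letting the ad server link the visits into one browsing history.
        print(f"{page_domain}: ad server receives {sent}")

load_page("nick.example", "ads.example")  # cookie set on a children's site
load_page("news.example", "ads.example")  # same id reported from another site
```

The second call is the point: the ad server learns that the same browser visited both sites, which is the aggregation the plaintiffs' amicus describes later in this post.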

Analysis of the VPPA: Congress enacted the VPPA after the 1987 Senate Judiciary Committee hearings regarding Supreme Court nominee Robert Bork. During the hearings, a newspaper obtained and publicized a list of the titles of 146 films Judge Bork or members of his family rented from a local video store.[6] The list of videos was, even by 1987 standards, unremarkable — not a single NC-17 film on the list. Congress agreed, however, that a person's video viewing history should be private. Consequently, under the VPPA, a user must give permission for his or her video viewing data to be shared. How does this translate to current technology? It doesn't. The court likened applying the VPPA to internet technology to putting a square peg in a round hole.[7] Additionally, the court referred to the VPPA as a rigid law that lacks the flexibility of the Children's Online Privacy Protection Act (COPPA) to effectively regulate technology that is "in flux."[8]

The key definitions under the VPPA are:

Consumer: any renter, purchaser or subscriber of goods or services from a video tape service provider.

Video tape service provider: any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale or delivery of prerecorded video cassette tapes or similar audio visual materials.

Personally identifiable information (PII): includes information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.

18 U.S.C. § 2710(a). A violation of the VPPA occurs when "[a] video tape service provider … knowingly disclose[s], to any person, personally identifiable information concerning any consumer of such provider." Id.

The VPPA was created to protect information specific to viewers of movies. The court noted that if the definition of PII were expanded for all statutes to include an IP address, there would be no end to litigation, and the distinctions between the groups protected by particular statutes would be eroded. Congress' decision to omit a new definition of PII in the 2013 amendment of the VPPA further emphasized that the VPPA "serves different purposes, and protects different constituencies, than other, broader privacy laws."[9] For example, if "Google were to start purposefully leaking its customers' YouTube video-watching histories," the VPPA "would almost certainly" be violated.[10]

Extending the VPPA to regulate current technology would likely result in unlimited violations. Defining an IP address as PII within the context of the VPPA would mean that the disclosure of an IP address to any internet company with registered users might trigger liability, given that an IP address is regularly transmitted to an Internet service provider (ISP) with each search.[11] The court also pointed out that there is a spectrum of PII, with first and last name at one end, and an IP address at the other, lower, end of the spectrum, given that an IP address alone may be insufficient to identify a person. The case cited by the court to illustrate the need for a subpoena to identify a person is a copyright infringement case, Warner Bros. Records Inc. v. Walker, 704 F. Supp. 2d 460 (W.D. Pa. 2010). Warner Bros. needed a subpoena to identify the student who was assigned the IP address used to illegally download songs. The student, who shared a room with multiple roommates, likely would not have been identified without a subpoena, given that several people may have used the computer. It was not "reasonably" likely that Warner Bros. could identify the person responsible for the downloads without a subpoena. Understandably, a subpoena may be necessary in a fluid environment such as a college, where multiple people may have access to a computer.

Time-out: It’s one thing for Warner Bros. to need help from the college to identify which of multiple people may have used an IP address assigned by the college. It’s something altogether different when Google, which the court describes as “a company whose entire business model is purportedly driven by the aggregation of information about Internet users,” wants to identify a person. The plaintiffs’ amicus very astutely provided some real-world perspective about what happens when Google wants to find out who you are: “concluding ‘that Google is unable to identify a user based on a combination of IP address … and other browser cookie data … would be like concluding the company that produces the phone book is unable to deduce the identity of an individual based on their telephone number.’”[12] Enough said. Resume play.

The court affirmed the dismissal of the intrusion upon seclusion claim against Google: Although the court acknowledged that many people, and some courts, find the monetization and collection of data from children without parental consent repugnant, those acts alone did not establish a claim for intrusion upon seclusion. Under New Jersey law, an intrusion upon seclusion claim requires a showing of (i) an intentional intrusion (ii) upon the seclusion of another that is (iii) highly offensive to a reasonable person.[13] The court disregarded the fact that children, instead of adults, were tracked, because third-party cookies serve a legitimate commercial purpose for advertisers and Google uses them on the Nick Sites the same way it uses them on other, non-child-directed sites.

This is why Viacom may be liable for intrusion upon seclusion: When Viacom notified parents that it did not collect any personal information about children, a jury could reasonably decide that parents permitted their children unsupervised access to the Nick Sites based on that disclaimer. If the parents of the plaintiff class members didn't already have an expectation of privacy, Viacom's notice created one. Viacom's violation of that trust by surreptitiously collecting data from children could be considered highly offensive under the applicable law.

Summary

An IP address has been likened to a computer's fingerprint. If a statute identifies an IP address or other statically assigned number as PII, that number is a great starting point for identifying a user. For example, under COPPA and HIPAA, an IP address is as high on the spectrum of PII as a user's first and last name. The rationale behind the ranking of an IP address in these statutes is that it is sometimes reasonable to expect an IP address to lead to the user. Who's looking for you also matters: it is reasonable to conclude that Google, using third-party cookies, can use your IP address to identify you.

Sometimes an IP address can only identify a computer, i.e., it cannot “reasonably” be used to identify you. Without a subpoena or some alternate means of creating a mosaicked identity, you may have to resort to battling “John Doe” until a subpoena grants you the right to retrieve additional information about the IP address. In these instances, IP addresses are not considered to be PII. At the end of the day, you have found a computer. Good job.

 

What did we learn?

  • Don't oversell your privacy policy. Viacom faces potential liability because it violated its own privacy notice to parents.
  • Do the right thing. Don't collect information from children under the age of 13 that is defined as PII under any privacy law without parental consent. These days there are few things about which 90 percent of Americans agree — Viacom's actions on the Nick Sites are considered to be highly offensive.
  • Now that you know that children's browsing history, IP address and other information can sometimes be collected through third-party cookies without parental consent, educate your children. The Federal Trade Commission's Admongo campaign provides guidance on helping children distinguish between ads and entertainment: https://www.commonsensemedia.org/website-reviews/admongo.
  • Understand that a trade-off for having the world at your fingertips may mean sharing your computer’s fingerprint with inquiring minds.

Stay safe.


[1] In re Nickelodeon Consumer Privacy Litigation, No. 15-1441, 2016 WL 3513782 (3d Cir. June 28, 2016).

[2] Id. at * 4 (alleging that (1) targeting ads to children is more profitable than targeting ads to adults, in part, “because children are generally unable to distinguish between content and advertisements;” (2) 80% and 90% of 2,000 adult respondents, respectively, oppose an advertiser’s tracking of children and believed advertisers should obtain a parent’s permission before installing cookies on a device used by a minor child; and (3) companies can use “browser fingerprinting” to identify specific users).

[3] In re Hulu Privacy Litigation, No. 11-CV-3764 (LB), 2014 WL 1724344 (N.D. Cal. Apr. 28, 2014).

[4] In re Nickelodeon, No. 15-1441, 2016 WL 3513782, at *3 (3d Cir. June 28, 2016).

[5] Id.

[6] Michael Dolan, “The Bork Tapes,” City Paper, Sept. 25–Oct. 1, 1987, at 1.

[7] In re Nickelodeon, No. 15-1441, 2016 WL 3513782, at *15 (3d Cir. June 28, 2016).

[8] Id. at *16-19.

[9] In re Nickelodeon, No. 15-1441, 2016 WL 3513782, at *19 (3d Cir. June 28, 2016).

[10] Id. at *17.

[11] Id. at *20-21.

[12] Id. at *20.

[13] Hennessy v. Coastal Eagle Point Oil Co., 609 A.2d 11, 17 (N.J. 1992) (citing Restatement (Second) of Torts § 652B (1977)).

Reputation Matters: Don’t Lose Opportunities Due to Inaccurate Personal Data

Posted in Data Security, Privacy

“A reputation once broken may possibly be repaired, but the world will always keep their eyes on the spot where the crack was.” ― Joseph Hall

Consumers may be injured by inaccurate data that they cannot review or correct. There’s a hole in the bucket, dear Congress.[1] 

The children's song "There's a Hole in the Bucket" exemplifies the conundrum many consumers experience when they are denied opportunities or inappropriately solicited. Data brokers maintain files with over 1,500 pieces of personal data on each of us. There are over 3,500 data brokers in the U.S. Only about one-third of them permit individuals to "opt out" of inclusion in their data banks, usually for a fee. Unless and until you recognize an unexplained pattern of lost job opportunities or rejected apartment applications, or are targeted by unsolicited marketing, you may not care what data brokers maintain in your files.

Imagine this: You are 22 years old and gung-ho to use your brand-spanking-new business organization degree as an entry-level traveling corporate trainer. You grant recruiters the right to conduct background checks after they indicate their interest in you based on your resume. You get rejection after rejection. You finally muster the courage to call a recruiter and ask why, and she explains that you are not a good fit based on background information that describes you as 39 years old, the parent of four young children, the holder of a Ph.D. and a sufferer of agoraphobia. None of this information is true.

Your prospective employers may have relied on information provided by data brokers or credit reporting agencies (CRAs) in determining that you are not a viable candidate. Now that you know inaccurate information is being reported about you, you are confident that you can correct your files and the employers will reverse their decisions. You can if the inaccurate information came from a CRA. But if data brokers provided the incorrect information, you will find yourself in the miserable position of knowing your files are wrong and being powerless to correct them. You know prospective employers have considered inaccurate information about you, but you don't know which employers relied on which data brokers or which inaccuracies in your files made you undesirable for hire. You don't know how many data brokers have files on you or what evidence you can provide to disprove the inaccurate information. You and "Dear Henry" share the predicament of wanting to fix the hole in the bucket but lacking the tools to do so.

Let the screening begin.

Many decisions about consumers, job applicants and second dates are based on inaccurate information provided by data brokers. Data brokers sell consumers' personally identifiable information (PII) to be used in marketing, people searches, fraud detection and risk management. The FTC defines data brokers as "companies that collect information, including personal information about consumers, from a wide variety of sources for the purpose of reselling such information to their customers for various purposes, including verifying an individual's identity, differentiating records, marketing products, and preventing financial fraud."[2] The Fair Credit Reporting Act (FCRA) applies to CRAs like Experian, TransUnion and Equifax, not to data brokers. CRAs must take reasonable steps to ensure the accuracy of the consumer PII they distribute, and they must provide consumers the opportunity to review and correct their records. There are zero federal statutes or regulations governing data brokers in this regard. If enacted as introduced in 2009, the Data Accountability and Trust Act (DATA) would provide procedures for individuals to audit and verify the accuracy of data held by data brokers. The swath of data collected by data brokers is astounding and troubling. Add the fact that data brokers are generally less expensive to use than CRAs, and employers and individuals alike are at a distinct disadvantage relative to data brokers.

Here’s what’s in the bucket.

Reports about consumers are based on information showing what they own or use; who they are with, have lost or are fighting; how they make, save and spend their money; and what interests or ails them, including mental, genetic and "other" diseases that may be embarrassing.[3] For example, when you register your car, record a deed of trust, activate a warranty, join a Facebook group, fill a new prescription, or get sued, married, divorced or widowed, data brokers collect that information. It is tacitly understood that PII from data brokers is not accurate and enables discrimination in hiring and in the provision of resources and opportunities.[4] Consumer advocacy groups report that information used in people-search sites is not vetted; the consumer has the responsibility of figuring out which of 67 people named "Pamela Samuelson" authored Protecting Privacy Through Copyright Law? Marketing information is more accurate, but still unreliable. For example, a data broker may correctly report that a household member purchased a new car, but err by addressing car wash coupons to the resident third-grader. Risk mitigation information is the most accurate, because it is expected to at least correspond to the correct person, even if the results are outdated.

This brings to mind a television character who changed his name because he shared it with a well-known artist who was convicted of sex crimes. His new name, unfortunately, was shared with a well-known artist who was convicted of murder. How do you feel knowing that you may be judged by the bad report of someone who has a name similar to yours? The identities of the entities using the bad data may influence your answer.

Who’s looking in the bucket?

Financial institutions, government agencies, political organizers and insurance companies use the services of data brokers. As of May, the customers of one of the largest data brokers included "47 Fortune 100 clients; 12 of the top 15 credit card issuers; seven of the top 10 retail banks; eight of the top 10 telecom/media companies; seven of the top 10 retailers; nine of the top 10 property and casualty insurers; three of the top five domestic airlines; and six of the top 10 U.S. hotels."[5] How likely are you to recognize that, after your namesake niece filed for bankruptcy, the hotel prices you were offered increased by 18%?

Can you look in the bucket?

It depends on who filled the bucket. If data brokers filled it, no: no federal law gives an individual the right to look in the bucket, and a subpoena or other discovery procedure may be your best option to see your file. If a CRA filled the bucket, yes: an individual has the right to review and correct the information in it.

What can you do?

  • Educate yourself about your rights. See whether your state has any laws that offer you protection. California, for example, shields victims of violent crimes from having their PII publicized on the internet.
  • Opt out of as many of the data broker sites as is reasonable. Visit this website to get started: http://www.computerworld.com/article/2849263/doxxing-defense-remove-your-personal-info-from-data-brokers.html.
  • Lobby your federal and state legislators and align yourself with organizations that advocate for the right to control your PII.

Stay safe.


[1] There’s a Hole in the Bucket, Songwriters: Harry Belafonte, Odetta Felious Gordon © Next Decade Entertainment, Inc.

[2] FTC, Protecting Consumer Privacy in an Era of Rapid Change, at 68 (Mar. 2012).

[3] Steve Kroft, The Data Brokers: Selling Your Personal Information, 60 Minutes (CBS Mar. 9, 2014), http://www.cbsnews.com/news/the-data-brokers-selling-your-personal-information/.

[4] Exec. Office of the Pres., Big Data: Seizing Opportunities, Preserving Values, pp. 51-53, May 2014, http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf.

[5] U.S. Senate Commerce Committee, A Review of the Data Broker Industry: Collection, Use, and Sale of Consumer Data for Marketing Purposes (Dec. 2013).

Safe or Good? We All Have Choices to Make

Posted in Data Security

“He’s not safe, but he’s good.” (Referring to Aslan, the Lion, in The Lion, the Witch and the Wardrobe.) ― C.S. Lewis

I planned to write about the inspired, better-than-sliced-bread security option of using fingerprint authentication to protect our mobile devices. That imploded. In 2014, and earlier this year, courts in Virginia and California, respectively, issued warrants requiring suspects to provide their fingerprints to unlock phones so the government could access potentially incriminating evidence believed to be stored there.[1]

All “do,” no “talk.”

In contrast, courts have not forced individuals to reveal the passcodes used to secure their mobile devices.[2] What gives? Albert Gidari, the director of privacy at Stanford Law School’s Center for Internet and Society, explains that the Fifth Amendment protects thoughts, not things: “Unlike disclosing passcodes, you are not compelled to speak or say what’s ‘in your mind’ to law enforcement,” Gidari said. “‘Put your finger here’ is not testimonial or self-incriminating.” For example, you can be compelled to provide the key to your heart, but no one can make you reveal what is in your heart.

Why chain the door when all the windows are open?

The fingerprint authentication platform is only as good as its gaps. The maker of one of the top mobile operating systems has stored fingerprints as unencrypted images in local storage. Fingerprint data stored by two cellphone companies was breached despite the use of TrustZone, ARM's hardware-based security technology.[3] WhatsApp, which was mentioned in a previous blog, has also experienced data theft.
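The at-rest gap, at least, is addressable in software. Here is a minimal sketch of encrypting a captured fingerprint image before it touches local storage, using Python's third-party cryptography package (my choice for illustration; the post names no specific tool, and the image bytes are placeholders):

```python
# Minimal sketch: never write the raw biometric image to disk unencrypted.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, keep this in a hardware-backed keystore
cipher = Fernet(key)

raw_image = b"\x89PNG...fingerprint bytes..."  # placeholder for the captured image

token = cipher.encrypt(raw_image)          # only this ciphertext should reach storage
assert cipher.decrypt(token) == raw_image  # round-trip check
```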

Studies reveal the ineffectiveness of security software provided to users by their data providers. The software is largely ineffective because …  PEOPLE DON'T DOWNLOAD IT! People shred their mail, but don't download the platforms devised to protect their privacy when it takes the form of data. The most common reasons people don't download updates include: (1) suspicion that the updates are malware sent by hackers; (2) belief that an update won't benefit them if they are otherwise satisfied with their current service; (3) lack of understanding that updates provide security patches; and (4) expectation that updates will take too long or use too much memory. The authenticity of updates to your operating system can be checked by visiting the app store or your manufacturer's website, or by conducting an internet search for information about update releases.

A critical flaw in fingerprint authentication is hiding in plain sight.

The convenience and high-tech sexiness of using fingerprint authentication on our phones has clouded our judgment regarding some of the most basic things we know about security. Fingerprints have a characteristic that runs counter to a cornerstone of cybersecurity: fingerprints are immutable. If someone steals your password, you can change it. Quarterly mandatory password expirations illustrate the adage that the best password is a new password.[4] Heads would spin and roll in IT departments the world round if it were decreed that passwords would never be changed again.

And just like that, the floodgates are open.

A much-touted advancement of fingerprint authentication is that no one can steal your fingerprint. That's fine, but the image of your fingerprint can be stolen like any other image, and that image can give someone access to apps, browsers, photo albums, cloud files and online accounts, some of which may be secured by passwords cached in your phone history. Finally, does it make sense to have an expectation of privacy in our fingerprints? The legal answer is no. Since we literally leave our fingerprints everywhere, maybe we should reconsider relying on them to secure our privacy. Our unspoken thoughts are inalienable property. Apparently, fingerprints are just keys.

Convenience is usually a good thing. Good things may not be safe. We each have to weigh whether the convenience of opening our phones with a finger swipe instead of entering our PINs is worth the risk of losing our privacy.

Stay secure.


[1] Matt Hamilton, The government wants your fingerprint to unlock your phone. Should that be allowed?, LA Times, Apr. 30, 2016, http://www.latimes.com/local/california/la-me-iphones-fingerprints-20160430-story.html; and Quinton Plummer, Virginia police can force you to unlock your smartphone using fingerprint: Here's why, Tech Times, Nov. 3, 2014, http://www.techtimes.com/articles/19288/20141103/virginia-police-can-force-you-to-unlock-your-smartphone-using-fingerprint-heres-why.htm.
[2] SEC v. Huang, No. 15-269 (E.D. Pa. Sept. 23, 2015); Virginia v. Baust, No. CR14-1439 (Va. Cir. Oct. 28, 2014).
[3] Shubham Verma, Why I’m Not a Fan of Smartphones With Fingerprint Scanners, Gadgets360 Oct. 30, 2015, http://gadgets.ndtv.com/mobiles/opinion/why-im-not-a-fan-of-smartphones-with-fingerprint-scanners-759160.
[4] To be fair, there is some belief that changing passwords regularly is more harmful than not. E.g., Andrea Peterson, Why changing your password may do more harm than good, The Washington Post, Mar. 2, 2016, https://www.washingtonpost.com/news/the-switch/wp/2016/03/02/the-case-against-the-most-annoying-security-measure-virtually-every-workplace-uses/.

An Addendum to the Scariest Hack So Far

Posted in Privacy

Andy Johnson-Laird, President of Johnson-Laird, Inc., was kind enough to offer advice on three security techniques he recommends for detecting and denying hackers like those we discussed in last week's blog post.

  1. Intrusion Detection Systems identify external probing or port scanning of known IP addresses. He describes it as "the equivalent of lying in bed at night and listening for someone rattling your front door handle. It's a more or less constant rattling when it comes to port scanning." See https://en.wikipedia.org/wiki/Intrusion_detection_system.
  2. Network Address Translation (NAT-ing) prevents the routing of internal IP addresses. Internal IP addresses will be selected from the range of "non-routable" addresses reserved by the Internet Engineering Task Force and the Internet Assigned Numbers Authority (see the sketch after this list). See https://en.wikipedia.org/wiki/Reserved_IP_addresses.
  3. Internet Protocol Security (IPSec) is an end-to-end security scheme that requires mutual authentication at the beginning of communication and the negotiation of cryptographic keys during the session. See https://en.wikipedia.org/wiki/IPsec.
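To make item 2 concrete, here is a minimal sketch, using Python's standard ipaddress module, of checking whether an address falls in the IETF/IANA reserved, non-routable ranges; the sample addresses are illustrative:

```python
# Minimal sketch: classify addresses as routable or non-routable (reserved).
import ipaddress

def is_non_routable(addr: str) -> bool:
    """True if addr sits in a reserved/private range such as RFC 1918 space."""
    ip = ipaddress.ip_address(addr)
    return ip.is_private or ip.is_reserved or ip.is_link_local

for candidate in ["10.0.0.5", "192.168.1.20", "8.8.8.8"]:
    print(candidate, "non-routable:", is_non_routable(candidate))
# 10.0.0.5 and 192.168.1.20 are RFC 1918 addresses and should never appear as
# sources on the public internet; 8.8.8.8 is globally routable.
```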

News Flash, May 12, 2016:

The Department of Justice proposed that the biometric database the FBI has been amassing for eight years should be exempt from privacy laws.[1] If approved, the proposal would free the FBI to save images of faces, handprints, tattoos, iris scans and biographies of people who don’t know of the existence of, let alone the content of, their files. The largely unvetted findings of investigations other federal agencies conducted on job applicants will also be included in the database. In a nutshell, anyone could be detrimentally affected based on inaccurate information and never know it. Stay tuned for updates on this developing proposal.


[1] See Tim Cushing, FBI Doesn't Want Privacy Laws To Apply To Its Biometric Database, from the and-doesn't-want-to-let-citizens-know-how-THEIR-privacy-is-affected dept, TechDirt, May 12, 2016, https://www.techdirt.com/articles/20160508/13574834381/fbi-doesnt-want-privacy-laws-to-apply-to-biometric-database.shtml.

The Scariest Hack So Far

Posted in Data Security

Hackers have upped the ante. Data controllers wax fondly about the good old days when data was outright stolen. Back then, in 2013, there was a sense of fair play. Trolls did troll things. Assuming the victim implemented and maintained a “comprehensive information security program”[1] to protect the type of data that was compromised, its insurance carrier may have provided coverage and the issue was resolved. Now, ransomware, extortion and data sabotage may lead to ongoing issues for data controllers. Each of these types of cyberattacks is evolving in ways that are truly devious.

Data theft is to head cold as ransomware and extortion are to chicken pox.    

If the theft of your data is not on the scale experienced by Target, Wyndham or LivingSocial, your operations may recover in a short amount of time. Think of a localized data breach as a head cold. After the breach, you will be out of commission for a while, but if you take steps to avoid reinfection (implement a privacy policy), protect others from your symptoms (use firewalls and authentication) and complete your course of treatment (follow your plan and comply with breach notification laws), you will ultimately be fine.

Think of your computer network as patient zero. Ransomware is akin to the chicken pox: it hits fast, is contagious, and the signs of the illness can stay with you until you adopt an assumed name. Ransomware previously "only" locked your keyboard or uploaded unsavory files to your system, and attackers would notify you of the amount of Bitcoin required to regain access to your data or remove the offensive files. In 2013, hackers significantly increased their use of ransomware to (1) infect your system and (2) install a cryptographic key to lock and unlock your data. Once in, the attackers would gauge whether to access your financial accounts directly or send a ransom demand with a countdown showing when your data would become permanently inaccessible. Now, ransomware such as CryptoWall spreads to and infects the shared drives that connect to patient zero. If your whole office is infected, a quarantine may be required until all viruses are eliminated.

Data sabotage initially seems to be an asymptomatic attack, but can quickly become fatal. 

Hackers using data sabotage can remain innocuous while they mine data. Only when hackers know enough about your data to cripple you or enrich themselves is the true measure of their destructive nature realized.[2] Data sabotage may occur over a period of time and employ many distinct steps. Manipulation of the numbers reported in a Form 10-Q can cause a corporation's stock to crash and affect an entire industry. Competitors may also find, and exploit, vulnerabilities in your security.

A case with espionage, extortion and pseudonyms is a sign of things to come.

Wire Swiss GmbH (Wire Swiss) is currently seeking a declaratory judgment and alleging civil extortion against its competitor, Quiet Riddle Ventures dba Open Whisper Systems, and Moxie Marlinspike.[3] The litigants develop end-to-end encrypted messaging software. Wire Swiss claims the defendants threatened to accuse Wire Swiss of infringing copyrighted software code and to publicize "vulnerabilities" in the security of Wire Swiss' encryption software; Wire Swiss' payment of a $2 million licensing fee would prevent the threatened action. Wire Swiss claims that the specter of publication of security vulnerabilities in its encryption software could cause catastrophic damage to its reputation. Wire Swiss further claims that the defendants' threat coincided with the announcement that their Signal software had been incorporated into the WhatsApp messaging application. If true, the plaintiff's allegations are a prime example of how data saboteurs profit from their hacks. This case may also be fodder for legislation to create a safe harbor for security self-evaluation.

The best policy may be to trust no one.

Developing a zero-trust, multilayer security plan may be your best method of protection. Here are some common tips that may help keep your data virus- and hack-free:

  • Encrypt or anonymize your data.
  • Erect firewalls.
  • Invest in "anti" software: anti-virus, anti-malware and anti-spyware.
  • Update your software regularly.
  • Consider using a "kill switch": when suspicious events happen, the IT department should automatically be notified, and the network should shut down if no protective measures are taken.
  • Ensure granular access control is used.
  • Regenerate session identifiers on every request.
  • Use double or triple authentication.
  • Log errors instead of displaying them to potential hackers.
  • Revoke credentials when certain events occur.
  • Implement "eventing" so you know when certain categories of data are accessed and/or modified (a minimal sketch follows this list).
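As a minimal sketch of the last two ideas, the error-logging and "eventing" items, here is one way a Python service might record access to a sensitive data category and log failures internally instead of displaying them; the function, logger and category names are illustrative:

```python
# Minimal sketch of "eventing" plus internal error logging.
import functools
import logging

logging.basicConfig(filename="security_events.log", level=logging.INFO)
log = logging.getLogger("eventing")

def audited(category: str):
    """Decorator that records an event whenever a data category is accessed."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            log.info("access: category=%s func=%s", category, fn.__name__)
            try:
                return fn(*args, **kwargs)
            except Exception:
                # Log the failure internally rather than displaying it to a
                # potential hacker, then re-raise for normal handling.
                log.exception("error in %s", fn.__name__)
                raise
        return inner
    return wrap

@audited("payment_records")
def read_payment_record(record_id: int) -> dict:
    return {"id": record_id}  # placeholder lookup

read_payment_record(42)  # appends an access event to security_events.log
```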

Sadly, no safeguard is guaranteed. Using multiple defenses will, at a minimum, ensure you are not the slowest one running from the bear. Good luck, and may your houses remain pox-free.


[1] Fed. Trade Comm'n v. Wyndham Worldwide Corp., No. 2:13-cv-1887, at *4 (D.N.J. Dec. 9, 2015).
[2] See Edmund Lee, AP Twitter Account Hacked in Market-Moving Attack, Bloomberg, Apr. 24, 2013, http://www.bloomberg.com/news/articles/2013-04-23/dow-jones-drops-recovers-after-false-report-on-ap-twitter-page (S&P 500 Index dropped $136 billion in value following a fake tweet alleging Pres. Obama had been injured).
[3] Wire Swiss GmbH v. Quiet Riddle Ventures, LLC, et al., No. 2:16-cv-02340 (C.D. Cal. Apr. 6, 2016).  Mr. Marlinspike may (or may not) be Matthew Rosenfeld or Mike Benham.

Expectations of Privacy: Location Matters

Posted in Data Security, Privacy

Sometimes law enforcement needs a warrant to access cellphone data, sometimes a court order. Sometimes nothing is required.

 

Roaming while you roam.

Depending on where you use your cellphone, law enforcement may obtain your location records from your wireless provider without a court order or warrant — neither is required in Washington state. In urban areas where there are multiple cell towers, a phone’s location can be identified to within half a mile. Turning off your location services or powering down the cellphone alone will not shield you from law enforcement.


 

Are you where you said you’d be?

When processing data, cellphones communicate with the strongest available cell tower signal. Calls, texts and internet browsing generate cell-site location information (CSLI). CSLI is time-stamped and linked to the phone number. Cell towers emit different signals in each direction, so the phone’s movement is tracked by its angular position relative to a tower. From CSLI, law enforcement can track where you are; what cell phones are near you; with what phone numbers you communicate; how long you communicate; and what routes you travel. In essence, everything except the actual communication is recorded. If you aren’t a criminal, you may not care. Depending on how CSLI is produced, even the innocent can catch the attention of law enforcement.
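To picture what one such record holds, here is a purely illustrative sketch of a CSLI entry as a Python data structure, built from the fields this paragraph describes; none of the field names come from an actual carrier schema:

```python
# Illustrative only: a hypothetical shape for a single CSLI record.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CSLIRecord:
    phone_number: str    # each record is linked to the phone number
    tower_id: str        # the strongest tower the phone communicated with
    sector: int          # angular sector relative to the tower (direction)
    timestamp: datetime  # every record is time-stamped
    event: str           # "call", "text" or "data"; never the content itself

record = CSLIRecord("555-0100", "TWR-114", 2, datetime(2016, 4, 1, 9, 30), "call")
print(record)  # a sequence of such records traces routes and associations
```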

 

Despite your innocence, law enforcement may receive your cellphone usage records.

“Tower dumps” entail producing CSLI for all cellphones that processed data sent through a tower. If your cellphone uses a targeted tower, your data may be captured. You go from having fun with your friends to law enforcement tracking you by your number.


Producing CSLI in real time transfers your records contemporaneously, as cellphone usage creates them. If this happens, you are probably having a bad day; this method may be used in exigent circumstances. Collection of historical CSLI shows all CSLI for a cellphone over a specific period of time.

 

One person’s record is another person’s dirty laundry.

In U.S. v. Timothy Carpenter[1], the Sixth U.S. Circuit Court of Appeals joined the Eleventh Circuit[2] in ruling that law enforcement agencies do not need a warrant to track a caller’s location through cell tower records. Timothy Carpenter and Timothy Sanders robbed nine cellphone stores (ironic, isn’t it?) in Michigan and Ohio within four months.

The FBI requested the "transactional records" of the Timothys' wireless providers, aka the Timothys' historical CSLI. Court orders were issued pursuant to the Stored Communications Act (SCA) after the FBI showed there were reasonable grounds to believe the CSLI was relevant to the investigation. The FBI reviewed 127 days of CSLI for one Timothy and 88 days for the other. The government established through the historical CSLI that the Timothys were located within a half-mile to two miles of each armed robbery when it occurred. On appeal, the defendants argued that the Fourth Amendment required the government to show probable cause and use a search warrant to access the CSLI.

 

The opinion focused on the following:

  1. There was no search, because the FBI collected the wireless providers’ data routing information, which was gathered in the ordinary course of business.
  2. CSLI does not refer to the content of the defendants’ private communications.
  3. Every cellphone user who has paid roaming fees knows that wireless carriers collect locational information, so there was no expectation of privacy.
  4. CSLI is so imprecise compared to GPS data that there is no expectation that the cellphone user can be located exactly.
  5. The SCA requires the government to show "reasonable grounds," not "probable cause," to obtain a court order for CSLI.

 

Tracking or stalking? Duration matters.

The concurring opinion in Carpenter questioned whether the business-records standard of proof applies in the review of an alleged violation of Fourth Amendment rights. The rationale behind the question is that a business's production of credit card records showing purchases, for example, may be sufficiently distinct from the production of cellphone records showing personal location to require a more stringent analysis. The concurring judge also found the scope of the location monitoring troubling. Lawfully tailing a suspect is one thing. Lawfully tailing a suspect for a period of three to four months transmutes the surveillance into the realm of privacy invasion.

 

So, how do I keep law enforcement out of my data?

iOS and Android operating systems and apps offer some protection for location data. Start by turning off the location services for all your existing apps. Download apps that discard the location data cached on your cellphone. Get and stay off the grid by using localized Wi-Fi connections. Rely on an offline map or an app that anonymizes the cellphone, encrypts the location data and permanently deletes your data within a certain amount of time. Regularly monitor the new technology used by law enforcement and cybersecurity experts.


After all, when you build a better mousetrap, law enforcement will build a better mouse. Justice William O. Douglas called upon his Pacific Northwest ideals when he wrote, “The right to be let alone is indeed the beginning of all freedom.” Cheers to freedom.

 


[1] www.ca6.uscourts.gov/opinions.pdf/16a0089p-06.pdf
[2] https://www.eff.org/document/us-v-q-davis-opinion

Foundation for the Lost Boys and Girls of Sudan v. Alcon Entertainment (N.D. Georgia, March 23, 2016)

Posted in Copyright, Intellectual Property

Rarely does one think of copyright issues surrounding how research is conducted for feature films based on real-life events, but 54 Sudanese refugees are forging that new connection between research and copyright in a lawsuit pending in the U.S. District Court for the Northern District of Georgia in Atlanta.

Those refugees, who survived starvation, disease and militia attacks in Sudan, arrived in the U.S. and sat down with film producer Robert Newmyer and screenwriter Margaret Nagle in 2003, sharing their life stories in a series of recorded interviews. The court found that the 54 refugees may be joint authors of the script for the 2014 film "The Good Lie," starring Reese Witherspoon: the refugees created original expression in the oral telling of their stories, which was fixed in a tangible medium in the recorded interviews. "While some common elements of the Lost Boys' story were publicly available, Newmyer and Nagle wanted to create a movie with real, personal and emotional details otherwise unavailable to the public at large," says the lawsuit. "Newmyer and Nagle needed the details from the Lost Boys' personal stories and permission to use those details in a screenplay and subsequent film." The title of The Good Lie, for instance, is said to refer to a lie, captured in the taped interviews, that one of the refugees once told would-be captors to save his life.

The defendants argued that the 54 refugees' answers in the interviews do not possess the modicum of creativity required to constitute an original work of authorship under copyright law. U.S. District Judge Leigh Martin May found otherwise in her recent 54-page ruling:

“The Interviews, however, did not consist merely of ‘ideas, facts and opinion made during a conversation,’ like the interviews by journalists in the cases Defendants cite,” responds May. “Rather, the Interviews were a creative process designed to create material for a screenplay and film. All that an ‘original work’ must possess is ‘some minimal degree of creativity’ … even a slight amount will suffice. Plaintiffs’ telling of their personal stories in response to questions designed to elicit material to create a fictional script for a feature film likely includes enough creativity to render the Interviews an original work of authorship.”

A potential flaw in the case arose when it was shown that the 54 refugees did not have U.S. copyright registrations for the taped interviews. The defendants argued that the case must be dismissed for the plaintiffs' failure to obtain the requisite registrations; however, the court held that the failure to register was solely due to the defendants' refusal to provide the 54 refugees with copies of the interviews. The defendants were found to have been the sole obstacle to the plaintiffs having the tape recordings needed to satisfy the U.S. Copyright Office requirement that copies of the work at issue be deposited.

The court found that the lawsuit brought by the umbrella organization, the Foundation for the Lost Boys and Girls of Sudan, and the 54 refugees may proceed on claims of copyright infringement, breach of the joint venture agreement, breach of fiduciary duty, conversion of the plaintiffs' ideas, breach of the covenant of good faith and fair dealing, unjust enrichment, promissory estoppel and other theories. The complaint alleges that Robert Newmyer and others orally promised the 54 refugees that the Foundation would be one of the joint members of the movie venture, and that the Foundation would be the sole beneficiary of any fundraising efforts associated with the movie. The court held in its March 2016 opinion both that a jointly authored copyright may have arisen in the recorded interviews and that an oral promise to be joint participants in the venture is enforceable notwithstanding Newmyer's untimely death in 2005.

This lawsuit is noteworthy even beyond the allegations of Hollywood exploitation of Sudanese refugees, as it is an ambitious attempt to establish authorship predicated upon tape-recorded interviews. This is a case to watch.