Welcome to the Summer 2017 edition of Fenwick’s Privacy Bulletin. It’s an exciting time for our privacy and cybersecurity team.
In June, we welcomed industry leader James Koenig in our New York office. Jim is building an integrated team of lawyers, cross-discipline professionals and former industry privacy and cybersecurity officers to bolster the range of services we’re offering to our clients. Learn more about Jim’s background and our expanded team in our press release.
In addition, our latest bulletin below covers the privacy and security developments and trends we think you should know more about.
The WannaCry ransomware1 attack that began on May 12 infected 230,000 computers in more than 150 countries within a few days. The scope of the attack was unprecedented—which is just one reason that companies need to identify preventive measures now.
WannaCry spread through an exploit called EternalBlue that infected Windows computer systems through a vulnerability in the Server Message Block protocol.2 WannaCry targets and encrypts 176 different file types, including Microsoft Office documents and database, multimedia and archive files. It initially demanded payment of $300 in Bitcoin in return for restoring access to the encrypted data, and it increases the ransom incrementally after a certain time limit.
WannaCry is a particularly dangerous strain of ransomware as it propagates without user interaction by scanning not only other computers on the network of an infected computer, but also over the internet to exploit the same vulnerability and infect other connected devices. A security researcher discovered a “kill switch” in WannaCry and registered a domain name for a DNS sinkhole3 that greatly impeded WannaCry’s spread.
The devastating effects of WannaCry demonstrate the urgent need for organizations to adopt preventive measures before their computer systems become infected with ransomware. One of the most important things that organizations can do is keep their systems updated with the latest software patches. Organizations that do not install the latest patches are much more vulnerable. Many of the computers infected by WannaCry either had not yet installed the Microsoft security update or were running an older version of Windows for which no updates had been released. Although other variants of WannaCry without the “kill switch” began to appear, the widespread application of the Windows security updates slowed the number of new infections to a trickle.
Organizations should consider limiting the number of services running on their systems. Unneeded services give an attacker more ways to exploit a vulnerability. Disabling the SMB protocol on systems that do not require it, for example, would protect against the spread of WannaCry. Deploying the latest firewalls, intrusion prevention systems and antispam programs would also maximize the likelihood of preventing an infection.
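As a minimal sketch of how an organization might audit SMB exposure on its own hosts (this is illustrative only, not a compliance tool; the function name and host addresses are placeholders), a simple TCP reachability check against port 445, the port WannaCry used to spread, can flag machines where SMB should be disabled or firewalled:

```python
import socket

def smb_port_open(host: str, port: int = 445, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to the given port succeeds.

    A reachable port 445 suggests SMB is exposed on that host and may
    warrant disabling the service or adding a firewall rule.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused or timed out: the port is not reachable.
        return False

# Example: audit a host (loopback used here as a stand-in for internal hosts)
for host in ["127.0.0.1"]:
    status = "SMB port reachable" if smb_port_open(host) else "no SMB response"
    print(host, status)
```

A check like this only confirms network reachability; actually disabling SMBv1 or blocking port 445 would be done through the operating system or firewall configuration.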
As with other malware, ransomware infections typically occur through spam emails and other social engineering attacks. Organizations should educate employees to verify the legitimacy of an email before clicking on any of the links in the email or opening any files attached to the email.
Despite the preventive measures organizations may employ, a ransomware infection may still occur. Organizations should plan for such possibilities by backing up all of their critical data and systems on a regular basis. Since ransomware may infect all drives connected to a network, backups should be both offsite and offline. Just as backups should be segregated, other parts of a network should also be segregated from each other if possible. Network segmentation can help contain a malware infection and reduce its impact on the organization. For instance, in the case of WannaCry, blocking access to SMB ports on computers could potentially limit its reach.
Organizations that do suffer from a ransomware attack have limited options. Unless the ransomware that infects their system is a known strain with a publicly available decryption key, victimized organizations should focus their efforts on restoring critical data through their backups to return to operation as soon as possible. Moreover, if an organization’s network is segmented and the ransomware only affects certain parts of the network, the infected computer or computers should be isolated from the rest of the network to lessen the risk of further infection.
Absent these precautions, organizations must confront the difficult decision of whether to pay the ransom demanded by an attacker. Law enforcement guidance states that organizations should not pay. Paying a ransom does not guarantee that an attacker will restore access to the encrypted data. Nor does it prevent the attacker from launching another attack against the same organization for a higher ransom. However, if an organization lacks a viable backup of the encrypted data and possesses an immediate need for access to this data (as with health care facilities), the organization may have no alternative but to simply pay the ransom and assume the risk that the attacker will restore access.
1Ransomware is malware that infects computer systems and denies users access to those infected systems until they pay a “ransom.”
2The SMB protocol is a network protocol used primarily for sharing access to files, printers and serial ports and communications between nodes on a network.
3A DNS sinkhole, also known as a sinkhole server, is a DNS server that gives out false information to prevent the use of a domain name. DNS stands for domain name system.
As voice recognition and facial scan technology has improved, organizations are increasingly employing biometric identifiers in the authentication processes for devices and online applications and accounts. Surprisingly, there is no comprehensive federal statute or regulation governing the collection, protection, use or disposal of biometric data. The U.S. Federal Trade Commission has issued recommended best practices for the use of facial recognition, but these are nonbinding and serve only as guidance. Until recently, only two states had adopted laws regulating the use of biometric data—Illinois and Texas. In May 2017, Washington became the third state to enact a law governing the collection, use and retention of biometric data.
In 2008, Illinois passed the Biometric Information Privacy Act, which set forth a comprehensive set of rules for the collection and use of biometric data. Organizations must provide written notice prior to the collection of any biometric identifier. The notice must include the purpose of the collection and the duration for which the organization will use or retain the data. Only after obtaining written consent can organizations begin their collection activities. Once they have collected biometric data, the BIPA requires organizations to protect that data in the same manner they would protect other sensitive and confidential information, using the reasonable standard of care in their industry. In addition, the BIPA requires organizations to have a publicly available written policy stating how long the organization will retain the data and the rules governing its destruction.
The BIPA prohibits organizations from selling or otherwise profiting from the biometric data they collect. It further prohibits organizations from disclosing biometric data unless (1) they obtain consent; (2) the disclosure completes a financial transaction requested by the individual; (3) the disclosure is required by federal, state or municipal law; or (4) the disclosure is required by a valid warrant or subpoena.
The BIPA provides a private right of action for violations of the statute and entitles a prevailing party to statutory damages for each violation equal to the greater of $1,000 or actual damages for negligent violations, and the greater of $5,000 or actual damages for intentional or reckless violations. The existence of the private right of action has led to considerable litigation against Facebook, Google, Shutterfly and Snapchat over their use of facial scanning and/or recognition technology.
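The per-violation damages formula described above—the greater of a statutory floor or actual damages, with the floor depending on the defendant's state of mind—can be sketched as follows (an illustrative calculation only, not legal advice; the function name is ours):

```python
def bipa_statutory_damages(actual_damages: float, intentional: bool) -> float:
    """Per-violation recovery under the BIPA as described above:
    the greater of the statutory floor or actual damages, where the
    floor is $1,000 for negligent violations and $5,000 for
    intentional or reckless violations.
    """
    floor = 5000 if intentional else 1000
    return max(floor, actual_damages)

# Negligent violation with $200 in actual damages: floor controls
print(bipa_statutory_damages(200, intentional=False))
# Intentional violation with $12,000 in actual damages: actual damages control
print(bipa_statutory_damages(12000, intentional=True))
```

Because the statute awards this amount for each violation, exposure in class litigation over facial scanning can scale with the number of affected individuals.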
Texas enacted its own biometric data law shortly after the passage of the BIPA. Similar to the BIPA in many regards, the Texas law requires informed consent from individuals before organizations may begin collecting biometric identifiers. However, the consent need not be written. The Texas biometric law also imposes limitations on the sale of biometric information and sets forth security and retention requirements. Only the Texas Attorney General can enforce the state’s biometric law, as it does not provide for a private cause of action.
On May 16, 2017, Washington became the latest state to pass a law regulating biometric data, effective as of July 23, 2017. The Washington statute defines “biometric identifiers” as “data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual.” Significantly, perhaps in response to the litigation generated by the BIPA, Washington’s definition of “biometric identifiers” expressly excludes “physical or digital photograph, video or audio recording or data generated therefrom.” It also excludes information “collected, used, or stored for health care treatment, payment or operations” subject to HIPAA. The statute further contains a security exception, exempting parties that collect, enroll or store biometric identifiers in furtherance of a “security purpose.”
Washington’s biometric data law applies only to biometric identifiers that are “enrolled” in a commercial database, which is defined as “captur[ing] a biometric identifier of an individual, convert[ing] it into a reference template that cannot be reconstructed into the original output image and stor[ing] it in a database that matches the biometric identifier to a specific individual.” Organizations may not enroll a biometric identifier unless they provide notice and obtain consent. The statute does not require a specific type of notice. Instead, it states that notice is “context-dependent” and only needs to be “given through a procedure reasonably designed to be readily available to affected individuals.” The statute, however, specifically notes that “[n]otice… is not affirmative consent.”
Absent consent, an organization may not sell, lease or disclose biometric data to a third party for commercial purposes, except where a statutory exception applies. These exceptions include where necessary to provide a product or service requested by the individual and where disclosure is made to a third party “who contractually promises that the biometric identifier will not be further disclosed and will not be enrolled in a database for a commercial purpose” that is inconsistent with the law. Even with consent, organizations may not use the biometric data they collect for any purpose that is “materially inconsistent” with the original purpose of the collection.
The Washington biometric statute imposes security and retention requirements. Organizations must exercise reasonable care to guard against unauthorized access to and acquisition of biometric identifiers. They must also retain biometric identifiers for no longer than necessary to comply with the law, protect against fraud, criminal activity or other security threats, or provide the service for which the biometric identifier was collected.
Like the Texas biometric data law, the Washington biometric data law does not provide a private right of action. Only the Washington Attorney General can bring an action to enforce the statute under the Washington Consumer Protection Act.
In the absence of federal legislation, more state laws regulating the collection, use and retention of biometric data appear imminent. Bills governing biometric data are currently pending in the Alaska, Connecticut, Montana and New Hampshire legislatures. Given the proliferation of biometric information as a means of identification and authentication, it is only a matter of time before more states adopt similar laws.
China’s broad new Cybersecurity Law, slated to go into effect on June 1, imposes sweeping data security requirements on network operators and critical information infrastructure providers. Enacted in November 2016 by the Standing Committee of the National People’s Congress of China, the law will be administered by the Cyberspace Administration of China, the Public Security Bureau and the Ministry of Industry and Information Technology.
Network operators are defined as the owners or administrators of a network and network service providers. Networks refer to systems that consist of computers or other information terminals that collect, store, transmit, exchange and process information. The Cybersecurity Law requires network operators to provide technical support and assistance to public or national security agencies when conducting an investigation of a crime. Network operators are also required to adopt technical measures to monitor and record their network operations, and to preserve network logs for six months. Network operators are further required to adopt technical measures to prevent intrusions such as viruses, adopt measures such as data classification systems, and implement security measures, such as backup systems and encryption.
Critical information infrastructure is defined as infrastructure maintained by certain industry sectors which would seriously jeopardize national security and the public interest should such infrastructure malfunction, or be subject to damage or data breaches. These sectors include, but are not limited to, public communication and information services, energy, transportation, water, financial services, public service and e-government affairs. Network products and services procured by critical information infrastructure operators are subject to national security examination if such products or services are likely to affect “national security.” Critical network products and dedicated network security products are subject to mandatory national standards and need to be certified and approved before they can be sold or provided in China. Critical information infrastructure operators are also required to undergo a network safety assessment at least once a year.
The Cybersecurity Law subjects critical information infrastructure operators to data localization requirements under which they must retain, within the territory of China, critical and personal information which they collect and produce during their operations in China. Personal information is defined as all information that either singly or in combination with other information identifies a natural person, including but not limited to names, dates of birth, identification numbers, personal biometric information, addresses and telephone numbers. They may still be able to transmit this information overseas, but only after undergoing and passing a security review.
The Cybersecurity Law also provides certain protections for individuals. It prohibits network operators from providing an individual’s personal information to third parties without the individual’s consent, except in cases where the personal information is irreversibly depersonalized such that the data does not identify particular individuals. In general, consent needs to be obtained from the individual from whom personal information is collected when a third party processes it.
Individuals can request that a network operator delete personal information if he or she discovers that its collection or use is in violation of the new law or a contract between the parties. Individuals can also request that a network operator correct any personal information that is inaccurate.
The Cybersecurity Law imposes penalties for noncompliance, including warnings, suspensions of operations, imprisonment and fines of up to RMB 1 million. It also imposes penalties—such as freezing of assets—against foreign corporations or individuals who endanger critical information infrastructure. In addition, the Cybersecurity Law provides that civil liability may be incurred for any breach of any provision of the Cybersecurity Law which results in damages to a third party.
On April 11, the Cyberspace Administration of China released draft Measures for the Security Assessment of Outbound Transmission of Personal Information and Critical Data. On May 19, after receiving comments, the CAC released revised draft measures, which provide additional detail on the restrictions on cross-border transfers and guidance on security assessments for data transfers.
The revised draft measures require not only critical information infrastructure operators but also network operators to store personal information and important data (defined as data closely related to national security, economic development and the social and public interest) within the territory of China, unless there is a genuine and legitimate business need to transfer the data overseas. In the event of such a transfer, network operators are required to conduct a security assessment. Large-scale transfers (i.e., transfers involving the personal data of more than 500,000 Chinese citizens) and transfers involving sensitive information, such as data concerning national defense or the military, public health, and large-scale engineering projects, must undergo a security assessment conducted before a regulatory authority. In addition to these security assessments, network operators transferring personal information must conduct security reviews of their cross-border transfers at least annually and report the assessments to the appropriate regulatory authorities.
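One reading of the thresholds described above, under which a transfer is escalated from a self-assessment to a regulator-led assessment, can be sketched as follows (an illustrative simplification of the revised draft measures, not legal advice; the function name and inputs are ours):

```python
def requires_regulator_assessment(num_individuals: int, sensitive: bool) -> bool:
    """Whether a cross-border transfer must be assessed before a
    regulatory authority under the revised draft measures as
    summarized above: transfers involving the personal data of more
    than 500,000 Chinese citizens, or transfers of sensitive
    information (e.g., national defense, public health, large-scale
    engineering projects). Smaller, non-sensitive transfers still
    require a self-assessment by the network operator.
    """
    return num_individuals > 500_000 or sensitive

# Large-scale transfer: regulator-led assessment required
print(requires_regulator_assessment(600_000, sensitive=False))
# Small but sensitive transfer: regulator-led assessment required
print(requires_regulator_assessment(10_000, sensitive=True))
# Small, non-sensitive transfer: self-assessment suffices
print(requires_regulator_assessment(10_000, sensitive=False))
```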
There are three instances where data transfers are expressly prohibited under the revised draft measures:
Even when an individual consents, network operators must notify the individual of the purpose, scope and content of the transfer, as well as the recipient and the country where the recipient resides.
The revised draft measures were scheduled to go into effect together with the Cybersecurity Law on June 1, 2017. However, they contain a grace period until December 31, 2018 for network operators to comply with the cross-border transfer requirements.
Armen Nercessian and Mary Griffin*
Early this month, the U.S. Supreme Court added Carpenter v. United States to the roster for consideration in the upcoming October term. Carpenter will mark the Court’s first chance to address an important, as-yet unresolved question in the digital age: Does the Fourth Amendment require a warrant for law enforcement officials to obtain cell site location information, or CSLI, which reveals the location and movements of a cell phone user?
The case will address the tension between the Fourth Amendment and the Stored Communications Act, which Congress enacted as Title II of the Electronic Communications Privacy Act of 1986. The SCA specifies procedures that law enforcement may use to obtain certain records from third-party “electronic communication services” or “remote computing services.” But it does not require a warrant. Since its enactment, third-party service providers have routinely cooperated with law enforcement requests to disclose—subject to certain statutory requirements—customer data. Notably, the petitioner here does not attack the constitutionality of the SCA. Rather, Carpenter asks whether companies should require a warrant, supported by particularized findings of probable cause, before disclosing CSLI. This question has caused considerable uncertainty among service providers, which must balance responding to law enforcement demands for information against the privacy interests of their customers, and which need a clear roadmap of the appropriate procedures.
The uncertainty among service providers responding to requests for customer information under the SCA is exacerbated by a significant circuit split concerning whether the Fourth Amendment applies to CSLI. There have been no fewer than 18 separate majority, concurring and dissenting opinions across five circuit courts on the issue, and courts have fractured over whether there is any “reasonable expectation of privacy” in CSLI and other customer data. Carpenter implicates three different strains of Fourth Amendment jurisprudence: (1) the third party disclosure doctrine, (2) the physical trespass doctrine and (3) the distinction between content and non-content information. The case asks the Court to decide whether these doctrines, which first arose in the pre-digital world, retain their vitality today. And it will allow the Court to consider whether the accumulation of data by third-party service providers—now commonplace—gives rise to any new privacy interests under the Fourth Amendment.
In connection with the investigation of a series of armed robberies, federal prosecutors moved under the SCA for court orders requiring two cellular service providers to disclose 187 days of phone records, including CSLI, for petitioner Timothy Carpenter. Based on the CSLI, the government charged Carpenter with aiding and abetting robbery. Carpenter moved to suppress the evidence, but the district court rejected Carpenter’s argument and held that the government’s collection was not a Fourth Amendment “search.” On appeal, the Sixth Circuit affirmed, holding (1) that the records did not disclose the content of communications and thus were not entitled to any Fourth Amendment protection; (2) that the disclosure of the records to third-party cellular providers defeated any “reasonable expectation of privacy” under the seminal case Katz v. United States, 389 U.S. 347 (1967); and (3) that the physical trespass doctrine—which the Supreme Court had revived in its recent Riley v. California, 134 S. Ct. 2473 (2014), and United States v. Jones, 565 U.S. 400 (2012), decisions—did not apply.
Concurring in the outcome on alternative grounds, one member on the panel, Judge Jane Branstetter Stranch, wrote separately to air her concerns about the Fourth Amendment tests that courts have applied in “this rapidly changing area of technology,” especially in light of “the sheer quantity of sensitive information procured without a warrant.”
Carpenter demonstrates the difficulty of applying the canonical tests under existing Fourth Amendment jurisprudence to the modern day. For example, there is the third party disclosure doctrine, which grows out of Katz’s “reasonable expectation of privacy” test. For someone to have a reasonable expectation of privacy in a piece of information, (1) that person must subjectively exhibit an expectation of privacy and (2) that expectation must be objectively reasonable. The core concept is that people have no reasonable expectation of privacy in any information they disclose to third parties, because they already subjectively surrendered any such expectation with the fact of disclosure. Where the doctrine applies, you cannot even get past the first step of the Katz framework, and Katz has remained black letter law on the books for half a century now. But in the digital age, where persons passively disclose so much information about themselves (and their whereabouts) to third parties at all times, what reasonable expectation of privacy could possibly be left?
Or take the related distinction that the Fourth Amendment marks between content information and non-content information, such as addressing. The idea here is that a person has no reasonable expectation of privacy in non-content information, because it is frequently disclosed, either to a third-party service provider or to the public more broadly. Consider, for instance, a package sent through the mail: its contents are unknown, and thus the sender has a reasonable expectation of privacy in them. But all other information about the package—the return and destination addresses, the amount of postage on it, its size, shape and weight—is ascertainable by any mail carrier or member of the public who comes into contact with it. And so there is no reasonable expectation of privacy in that kind of information. On balance, CSLI appears closer to what courts have traditionally considered addressing or other non-content information: it does not tell you what a person said or did; it just shows you where a person was.
Finally, there is the trespass theory of the Fourth Amendment, which the Supreme Court resurrected in its recent cases dealing with technology. In Jones, the Court held that the unauthorized placement of a GPS tracker on a car for long-term surveillance triggered Fourth Amendment protections. Similarly, in Riley, the Court held that law enforcement needed a warrant to search a mobile phone. But this trespass notion does not appear to have any place in Carpenter either. Police did not track Carpenter, or break into his cell phone; they merely asked for records from a third party who kept them.
None of these doctrines applies cleanly. Yet given the accumulation of information involved, there remains some visceral notion that the Fourth Amendment should apply here. The only question is how.
While the petitioner here did not request a full rejection of the third party disclosure doctrine, the Court may nonetheless cut back on it. Chief Justice Roberts’s majority opinion in Riley suggested that persons still have some reasonable expectation of privacy in sensitive information collected over mobile phones and stored by service providers. Similarly, Justice Sotomayor’s concurrence in Jones warned against a strict application of the third party doctrine: “I would not assume that all information voluntarily disclosed to some member of the public for a limited purpose is, for that reason alone, disentitled to Fourth Amendment protection.” In both cases, the Court signaled that stringent adherence to Katz may stop making sense as technology evolves. But both cases side-stepped the issue by instead turning to the doctrine of physical trespass, and that doctrine cannot sensibly apply to the facts of Carpenter.
It is also possible that the Court might create a new strain of jurisprudence based on the quantity of records requested. Such an approach would likely introduce line-drawing issues, for instance, if a warrant is required for long-term tracking while the SCA suffices for short-term tracking. But, as Justice Alito’s concurrence in Jones and Judge Stranch’s concurrence in Carpenter point out, that might be appropriate. After all, in the modern era, it is not the disclosure of individual, isolated data points that seems problematic, but rather the accumulation of that data over time.
Which test will the Court apply? Service providers, and their customers, will have to wait until this October term to find out.

*Mary Griffin is a summer associate in Fenwick's litigation group.