This week’s U.S. Securities and Exchange Commission (SEC) enforcement cease-and-desist order (Order) In re App Annie Inc., out of the SEC’s San Francisco Regional Office, underscores the importance of taking meaningful steps to implement and abide by written policies on corporate data management and protection. The Order resolved fraud allegations that App Annie, an alternative data provider for the mobile app industry, and its co-founder and former CEO and Chairman Bertrand Schmitt (Schmitt) misused confidential data that App Annie had obtained by misrepresenting its data management and protection practices to its securities trading firm customers. The SEC ordered the company and Schmitt to pay civil fines of $10 million and $300,000, respectively, and barred Schmitt from serving as an officer or director of a public company for three years.
De-Identified Data Means De-Identified
Through a service called Connect, App Annie provides app analytics to companies that agree to give App Annie their app store credentials so that App Annie can collect the companies’ confidential (i.e., nonpublic) app performance data. App Annie represented to those companies that it would use the data only in aggregated, anonymized form to generate estimates of their apps’ performance for them.
Separately, through a service called Intelligence, App Annie sold confidential data about the estimated performance of apps to trading firms that sought to use the data to inform their investment decisions. Trading firms refer to the data as “alternative data” because of its nonpublic nature—it contains information that traditional, public sources of market data do not reflect.
In connection with selling Intelligence subscriptions to trading firms, App Annie represented that its app performance estimates were derived from a statistical model using the aggregated, anonymized data obtained through Connect, and that Connect users consented to this manner of using Connect data. According to the Order, however, App Annie used the Connect data in non-aggregated, non-anonymized form to render more accurate (and, in turn, more valuable) estimates of app performance to sell through Intelligence. Thus, the Order found that App Annie misrepresented to its customers how it had obtained the data it was selling, which constituted violations of Section 10(b) of the Securities Exchange Act of 1934 and Rule 10b-5 thereunder.
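The gap at the heart of the Order, between aggregated, anonymized estimates and raw, identifiable company data, can be made concrete with a minimal sketch. All names, fields and figures below are invented for illustration; this is not App Annie's actual system or data:

```python
# Hypothetical illustration of aggregated/anonymized use versus raw use
# of confidential per-company data. All names and numbers are invented.
from statistics import mean

# Raw, confidential per-company app performance records (what a service
# like Connect would collect from its customers).
connect_records = [
    {"company": "AcmeApps", "app": "acme-chat", "downloads": 120_000},
    {"company": "AcmeApps", "app": "acme-pay",  "downloads": 45_000},
    {"company": "BetaSoft", "app": "beta-game", "downloads": 300_000},
]

def anonymized_aggregate(records):
    """What was promised: drop identities, report only category-level figures."""
    return {
        "avg_downloads": mean(r["downloads"] for r in records),
        "sample_size": len(records),
    }

def raw_passthrough(records):
    """What the Order found: identifiable, non-aggregated data feeding estimates."""
    return [(r["company"], r["app"], r["downloads"]) for r in records]

print(anonymized_aggregate(connect_records))  # no company identities survive
print(raw_passthrough(connect_records))       # identities and exact figures leak
```

The aggregate output contains no company names or exact per-app figures, while the raw pass-through exposes both, which is why estimates built on the raw records are more accurate, and more valuable, than estimates built only on the aggregate.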
The Order also found that App Annie misrepresented to its trading-firm clients that it had internal policies and procedures in place to prevent misusing Connect data and selling material nonpublic information. Although App Annie had an internal data-management policy in place during the early part of the relevant period, the Order emphasized that the company did not document any such policy until several years later. Even when the policy was documented, it failed to set appropriate limits on the use of Connect data, and the company failed to take steps to ensure that the policy was properly implemented.
The SEC has other tools in its arsenal to police lax data-handling, most notably Regulation S-P, which requires that brokers, dealers, investment companies and investment advisers that are registered with the SEC have written policies to safeguard the financial and personal information of consumers. Other government entities have similar requirements, including, but by no means limited to, the New York State Department of Financial Services and Commonwealth of Massachusetts Securities Division. This growing collection of data regulations, which address data practices many might characterize as less egregious than those addressed in the Order, makes it essential that companies devote time and energy to devising appropriate data policies and take steps to put those policies into practice.
Four Data Governance Tips All Companies Should Heed
The SEC focused on App Annie’s lack of hardened policies around its handling of its customers’ confidential data, which allowed the company and Schmitt to engage in the “deceptive practices” described in the Order. The lessons for companies that collect data are clear and distill into the following four tips, applicable regardless of the type of data a company collects:
- Design and implement sensible data management policies that address your unique business needs. Design policies you can live with, and do not promise critical or unique controls and restrictions that you have not formalized.
- Develop a data classification policy, or otherwise set different policies and data handling and security standards for different types of data. Increasingly, what used to be just the domain of varying security standards based on business criticality (e.g., role-based access controls, encryption, log monitoring or two-factor authentication) is now being extended to privacy and data analytics controls around the collection, access, use and/or sharing of identifiable personal data (versus de-identified, pseudonymized or aggregated data).
- Regularly test your data collection and management policies. Customers, business partners, investors and acquirors are asking for more audits and assurances of compliance (e.g., assessments against SOC 2 Type II, PCI DSS, NIST and other certifications or standards).
- Do not make gratuitous or misleading promises. Be transparent with the subjects of the data you collect about how you handle it, and abide by your representations.
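The classification tip above lends itself to being expressed as code, so that handling rules are looked up rather than improvised. The categories and controls in this sketch are hypothetical examples, not a recommended standard:

```python
# Hypothetical data-classification policy: map each data category to
# minimum handling requirements. Categories and controls are illustrative only.
POLICY = {
    "identifiable_personal": {
        "encryption_at_rest": True,
        "role_based_access": True,
        "two_factor_auth": True,
        "analytics_use": "consented purposes only",
    },
    "pseudonymized": {
        "encryption_at_rest": True,
        "role_based_access": True,
        "two_factor_auth": False,
        "analytics_use": "internal analytics",
    },
    "aggregated": {
        "encryption_at_rest": False,
        "role_based_access": False,
        "two_factor_auth": False,
        "analytics_use": "may be shared externally",
    },
}

def controls_for(category: str) -> dict:
    """Look up required controls; fail closed on unclassified data."""
    if category not in POLICY:
        raise ValueError(f"Unclassified data category: {category}")
    return POLICY[category]
```

Failing closed on unknown categories mirrors the lesson of the Order: data whose permitted uses have not been classified should not be treated as freely usable by default.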
For more information, please contact Dave Feder, Tyler Newby, Jim Koenig, Mike Dicke, Marc Greco or any member of Fenwick’s Compliance Team or Privacy & Cybersecurity Practice.