
SEC Enforcement Action Against App Annie a First Example of Regulatory Risks to Both Alternative Data Providers and Buyers

Sam Cheng

Product Manager

Nov 2, 2021

Interest in the alternative data market has grown rapidly in the past few years, from data providers and data buyers alike, and now from the SEC as well. On September 14, 2021, the SEC announced that App Annie, an alternative data provider, and its former CEO and Chairman Bertrand Schmitt agreed to settle an enforcement action against them for $10 million and $300,000, respectively. According to the SEC order, App Annie misrepresented to its investment firm customers that the confidential data it collected from companies would be aggregated and anonymized before being used in its paid product. This is the first enforcement action the SEC has brought against an alternative data provider.


“The SEC is clearly targeting data privacy practices. Data providers and buy-side firms alike have to pay scrupulous attention to data protection.”

– former SEC Commissioner Joseph A. Grundfest


The action is a significant signal of the agency’s continued scrutiny of the alternative data space and of the need for data providers and buyers to manage regulatory risk.

The App Annie Settlement

App Annie is described in the SEC order as one of the largest sellers of mobile app performance data. It collects data by offering companies a free analytics product to track how their apps are performing. As part of this arrangement, App Annie stated in its Terms of Service that it would use these confidential app performance metrics only in aggregated and anonymized form.

App Annie monetizes this alternative data by offering investment firms an analytics product that produces estimates of app performance metrics, such as revenue. It represented to its investment firm customers that the estimates were generated by statistical models from data that had been aggregated and anonymized, in accordance with its commitment to the companies whose apps use App Annie’s analytics suite.

Despite these assurances, the SEC order describes a violation of the agreement between App Annie and the companies governing the use of the confidential app performance data. Instead of delivering only estimates produced by statistical models from aggregated and anonymized data, engineers at App Annie, at the direction of then-CEO Bertrand Schmitt, used non-aggregated, non-anonymized data to make manual alterations to the estimates sent to investment firms, bringing those estimates closer to the actual app performance figures. This practice was made known neither to App Annie’s app analytics customers nor to the data buyers making investment decisions based on the altered estimates. It is worth noting that the practice was carried out, unknown to other executives, for multiple years between 2014 and 2018, despite the existence of an internal policy on the use of data from the analytics product.

Risk Mitigation with LeapYear

This story has been of particular interest to several of our customers, given that LeapYear works with companies to monetize their alternative data for sale to investment firms. A key question we often hear is how LeapYear can prevent this kind of misuse of sensitive data.

In the LeapYear model, analysts never have access to the raw, sensitive data. Instead of being handed the raw data or an anonymized version of it, analysts run queries against the data through the LeapYear analytics platform. LeapYear uses the standard of differential privacy to ensure that the sensitive data cannot be reverse engineered from queries made through the platform. With this model, analysts can discover insights from the raw data while being protected from purposefully or inadvertently being exposed to any sensitive information. The approach relies neither on the trustworthiness of the analyst nor on the soundness of an anonymization methodology (which has been shown time and time again to fail).
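
To make the query model concrete, below is a minimal, illustrative sketch of a differentially private aggregate query in Python. It is not LeapYear's implementation: the dataset, the dp_mean function, and the epsilon value are all hypothetical, chosen only to show how noise calibrated to a query's sensitivity sits between the analyst and the raw records.

```python
import numpy as np

rng = np.random.default_rng()

def dp_mean(values, lower, upper, epsilon):
    """Return a differentially private mean of `values` (illustrative only).

    Values are clipped to [lower, upper] so that any one record can shift
    the sum by at most (upper - lower); Laplace noise scaled to that
    sensitivity is added before the statistic is released.
    """
    clipped = np.clip(values, lower, upper)
    sensitivity = (upper - lower) / len(clipped)  # sensitivity of the mean
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

# Hypothetical per-app daily revenue figures (the "raw" data the analyst
# never sees directly).
raw_revenue = np.array([1200.0, 950.0, 4300.0, 780.0, 2100.0])

# The analyst receives only the noisy statistic.
print(dp_mean(raw_revenue, lower=0.0, upper=5000.0, epsilon=0.5))
```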

In a case such as App Annie’s, if the confidential data had been protected by LeapYear, running queries against it would have ensured that (1) only privacy-protected statistics were released, and (2) entities in the data remained anonymous. Because any results going to the investment firms must first pass through the LeapYear privacy layer, the workflow itself blocks misuse of the raw, non-anonymized data. In this way, LeapYear provides rigorous, adaptive privacy protection that takes much of the burden of compliance and privacy protection off of human hands.
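
The sketch below illustrates the workflow point in architectural terms: raw records live only inside a gateway object, and the only operations it exposes return privacy-protected statistics. The class and method names are invented for illustration and are not LeapYear APIs; in a real deployment this boundary would be enforced server-side rather than by Python conventions.

```python
import numpy as np

class PrivacyGateway:
    """Illustrative gateway: raw data stays inside, only noisy statistics leave."""

    def __init__(self, records, epsilon_per_query=0.5):
        self._records = np.asarray(records, dtype=float)  # never returned to callers
        self._epsilon = epsilon_per_query
        self._rng = np.random.default_rng()

    def noisy_count(self):
        # Counting query: adding or removing one record changes the count by at most 1.
        return self._records.size + self._rng.laplace(scale=1.0 / self._epsilon)

    def noisy_sum(self, lower, upper):
        # Sum over values clipped to [lower, upper]; sensitivity is (upper - lower).
        clipped = np.clip(self._records, lower, upper)
        return clipped.sum() + self._rng.laplace(scale=(upper - lower) / self._epsilon)

# Whether the caller is an outside analyst or an internal engineer, the public
# interface offers no method that hands back the raw records.
gateway = PrivacyGateway([1200.0, 950.0, 4300.0, 780.0, 2100.0])
print(gateway.noisy_count(), gateway.noisy_sum(0.0, 5000.0))
```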


“Common approaches, such as de-identification or aggregation, often fail to protect confidentiality. LeapYear’s differential privacy solution offers powerful assurance against SEC liability because LeapYear’s customers can demonstrate they couldn’t possibly have accessed or disclosed PII, or other forms of confidential information.”

– former SEC Commissioner Joseph A. Grundfest


Not only is this architecture better for managing regulatory and privacy risk, it also allows more sensitive data to be safely monetized because the platform assesses the privacy needs of each query. This is in contrast with traditional approaches such as de-identifying a dataset and then releasing it to an analyst, in which a significant amount of information must be removed from the data (and even then, privacy is not truly protected).
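
One way to read “assesses the privacy needs of each query” is as privacy budget accounting: each query spends some of a total privacy allowance (epsilon), and the cumulative spend is tracked and capped. The sketch below shows that idea in its simplest form, sequential composition, with made-up names and numbers; it is not a description of LeapYear's accounting method.

```python
class PrivacyBudget:
    """Track cumulative epsilon spend under simple sequential composition."""

    def __init__(self, total_epsilon):
        self.total_epsilon = total_epsilon
        self.spent = 0.0

    def charge(self, epsilon):
        """Reserve epsilon for a query, or refuse if the budget would be exceeded."""
        if self.spent + epsilon > self.total_epsilon:
            raise RuntimeError("Privacy budget exhausted; query refused.")
        self.spent += epsilon

budget = PrivacyBudget(total_epsilon=1.0)
budget.charge(0.3)  # e.g., a coarse count over the whole dataset
budget.charge(0.5)  # e.g., a more detailed revenue estimate
# budget.charge(0.4) would raise: only 0.2 of the budget remains.
```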

Key Takeaways

The SEC enforcement action is a signal to everyone in the alternative data space that the agency will continue to look deeper into regulatory compliance in this growing market. From the App Annie example, there are key lessons to keep in mind whether you are a data provider or a data buyer:

Direct data access puts both data providers and data buyers at risk.

  • Granting direct access to sensitive data places trust in the analyst to follow regulatory or contractual agreements. As the App Annie example shows, even when the “analyst” is someone internal to the data provider organization, that trust must be backed by enforcement.
  • Data buyers should also be wary of accepting direct access to data. With direct access, analysts are fully exposed to potential mishandling on the part of the data provider, e.g., if the provider relied on techniques such as redaction or “de-identification” to anonymize data, which demonstrably do not work. While in this case App Annie’s investment firm customers were not found to have engaged in any misconduct, it is a risk that need not be taken.

Monetizing data without an experienced partner is risky.

  • App Annie’s primary business model is data monetization. Even so, the company did not have the policies and diligence necessary to prevent this misuse of confidential data. Many companies we speak with are looking to enter the alternative data market, but data monetization will not be their main source of revenue. In these cases especially, we strongly recommend finding a partner experienced not only in securities regulation but also in data privacy.

Policies and compliance diligence are necessary but not sufficient.

  • App Annie did have a policy limiting the use of the app performance data, but the policy was inadequate (it did not specifically address public companies and covered only revenue data), and the SEC order found that it was not even properly enforced. From the perspective of the investment firms, compliance diligence can be passed as long as the data provider supplies the answers needed to pass it.
  • This is a good example of how processes and policies alone are insufficient to address regulatory and privacy risks. A technical solution can close the gaps left open by human error and misconduct, and that is one reason LeapYear built a platform for privacy-preserving analytics.

To read more about how LeapYear protects data providers and data buyers, click here.
