Data Monetization Case Studies: What Not To Do

Sam Cheng | Product Manager

November 2, 2021

Alternative data markets have grown rapidly, attracting interest from both providers and buyers of alternative datasets.

Now the SEC is interested too.

The enforcement action by the SEC against App Annie, the first against an alternative data provider, is a data monetization case study in what not to do.

On September 14, 2021, the SEC announced that App Annie, an alternative data provider, and its former CEO and Chairman Bertrand Schmitt agreed to settle an enforcement action against them for $10 million and $300,000, respectively. According to the SEC order, App Annie misrepresented that it would aggregate and anonymize data from its providers before sharing it with investment firms interested in alternative datasets.

“The SEC is clearly targeting data privacy practices. Data providers and buy-side firms alike have to pay scrupulous attention to data protection.”

– former SEC Commissioner Joseph A. Grundfest

The action signals an escalation in the agency’s scrutiny of the alternative data market and serves as a warning to providers and buyers of alternative datasets to only share sensitive data with extreme care.

Too easy to switch off data aggregation and data anonymization

App Annie is described in the SEC order as one of the largest sellers of mobile app performance data. It collected data by offering companies a free analytics product to track how their apps were performing. As part of this arrangement, App Annie stated in its terms of service that it would use these confidential app performance metrics only after data aggregation and data anonymization.

The monetization solution offered by App Annie was an analytics product sold to investment firms, which included estimates of app performance metrics such as revenue. App Annie represented to its investment-firm customers that the estimates were generated by statistical models from aggregated and anonymized data. This representation also matched its commitment to the companies providing information for data monetization.

Despite these assurances, the SEC order describes a violation of the agreement between App Annie and the companies on the usage of the confidential app performance data.

At the direction of then-CEO Bertrand Schmitt, engineers at App Annie used non-aggregated, non-anonymized data to make manual alterations to the estimates delivered to investment firms. This violated the company's data monetization commitment to deliver only estimates modeled on aggregated and anonymized data. The motive for turning off data aggregation and data anonymization was to give buyers of the monetization solution estimates closer to the actual app performance figures. This practice was disclosed neither to App Annie's app analytics customers nor to the data buyers making investment decisions using the altered estimates. Notably, this mishandling of sensitive data, unknown to other executives, continued from 2014 to 2018. All of this happened despite express internal policies on usage of data from the analytics product.

How to monetize data safely

LeapYear works with companies to monetize their alternative data by selling it to investment firms, so this story has been of particular interest to several of our customers. The question we hear most often: how does LeapYear prevent sensitive data exposure?

With LeapYear, analysts can never access the raw, sensitive data. Instead of being handed the raw data or an anonymized version of it, analysts make queries against the data through the LeapYear analytics platform. Differential privacy is the standard underpinning LeapYear's software, which means the sensitive data cannot be reverse engineered from queries made through the platform. With this model, analysts can discover insights and extract analytical value, but cannot purposefully or inadvertently see any sensitive information. Differential privacy, as implemented by LeapYear, makes no assumptions about the trustworthiness of the analyst or the soundness of the anonymization methodology (which has been shown time and time again to fail).
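For readers curious about the mechanics, here is a minimal illustrative sketch of the core idea behind differential privacy, using the classic Laplace mechanism. This is not LeapYear's implementation; the function names, bounds, and epsilon value are hypothetical, chosen only to show how a query can return a useful aggregate while mathematically masking any individual record's contribution.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_sum(values, lower, upper, epsilon):
    """Differentially private sum of bounded values.

    Each record's contribution is clamped to [lower, upper], so the
    sensitivity (the maximum influence of any one record) is
    max(|lower|, |upper|). Adding Laplace noise with scale
    sensitivity / epsilon hides whether any single record is present.
    """
    clipped = [min(max(v, lower), upper) for v in values]
    sensitivity = max(abs(lower), abs(upper))
    return sum(clipped) + laplace_noise(sensitivity / epsilon)

# Hypothetical example: per-app revenue figures. The analyst only
# ever sees the noisy total, never the individual values.
revenues = [120.0, 95.0, 210.0, 48.0]
noisy_total = private_sum(revenues, lower=0.0, upper=250.0, epsilon=1.0)
print(noisy_total)
```

Because each result is noisy, no sequence of such queries lets an analyst recover an individual record exactly, which is the property that makes query-based access fundamentally safer than handing over a "de-identified" file.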

Data monetization failures such as App Annie's would not have happened with LeapYear.

With LeapYear protecting confidential data, running queries against it would have ensured that (1) only privacy-protected statistics were released, and (2) entities in the data remained anonymous. A workflow in which any results going to investment firms must first pass through the LeapYear privacy layer blocks people from misusing the raw, non-anonymized data. In this way, LeapYear provides rigorous, adaptive privacy protection that takes much of the burden of compliance and privacy protection out of human hands. This is privacy by design.

“Common approaches, such as de-identification or aggregation, often fail to protect confidentiality. LeapYear’s differential privacy solution offers powerful assurance against SEC liability because LeapYear’s customers can demonstrate they couldn’t possibly have accessed or disclosed PII, or other forms of confidential information.”

– former SEC Commissioner Joseph A. Grundfest

Not only is this architecture better for managing regulatory and privacy risk, it also increases the value of sensitive data that can be monetized safely, because the platform assesses the privacy needs of each query. This contrasts with traditional approaches such as de-identifying a dataset and then releasing it to an analyst, which requires removing significant amounts of information before release; even then, privacy protection remains fragile.
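One common way to reason about per-query privacy needs is a privacy budget: each query spends a portion of a finite total, and queries that would overspend are refused. The sketch below is a hypothetical illustration of that accounting pattern, not a description of LeapYear's internals; real platforms use far more sophisticated composition accounting.

```python
class PrivacyBudget:
    """Illustrative per-dataset privacy budget (epsilon) ledger.

    Each query spends part of a finite epsilon budget; once the
    budget would be exceeded, further queries are refused rather
    than risking cumulative disclosure.
    """

    def __init__(self, total_epsilon: float):
        self.total = total_epsilon
        self.spent = 0.0

    def charge(self, epsilon: float) -> None:
        if self.spent + epsilon > self.total:
            raise RuntimeError("privacy budget exhausted; query refused")
        self.spent += epsilon

# Hypothetical usage: two queries fit within the budget; a third
# identical query would be rejected automatically.
budget = PrivacyBudget(total_epsilon=1.0)
budget.charge(0.4)  # first query
budget.charge(0.4)  # second query
```

The point of the pattern is that safety does not depend on any analyst's restraint: the system itself enforces the limit, query by query.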

Lessons from this data monetization case study

The SEC enforcement action signals to everyone in the alternative data market that the agency will continue to look more deeply into regulatory compliance in this growing market. The App Annie example offers lessons for both data providers and data buyers:

Data monetization solutions should not provide direct data access

  • Granting direct access to sensitive data places trust in the analyst to follow regulatory or contractual agreements. As the App Annie example shows, even when the “analyst” is someone internal to the data provider's organization, that trust cannot simply be assumed; it must be enforced.
  • Data buyers should also be wary of accepting direct access to data. With direct access, analysts are fully exposed to potential mishandling on the part of the data provider, e.g., if the provider used techniques such as redaction or “de-identification” to anonymize data, techniques that demonstrably do not work. While in this case App Annie's investment-firm customers were not found to have engaged in any misconduct, it is a risk that should not be taken if it doesn't need to be.

Work with experts in the data monetization market

  • In App Annie's case, data monetization was its primary business model. Even so, it did not have the policies and diligence necessary to prevent this misuse of confidential data. Many companies we speak with are looking to enter the alternative data market, but their main source of revenue will not be data monetization. In these cases especially, we strongly recommend finding a partner experienced not only in securities regulation but also in data privacy.

Prevent sensitive data exposure with both technology and policy

  • App Annie did have a policy limiting the usage of the app performance data, but not only was the policy inadequate (it didn't consider public companies specifically, and it only covered revenue data), the SEC order found that it was not even properly enforced. From the perspective of the investment firms, compliance diligence is only as reliable as the answers the data provider supplies.
  • This illustrates how processes and policies are insufficient to address regulatory and privacy risks. A technical solution can plug the holes that are opened or not addressed due to human error and misconduct, and it is one reason why LeapYear built a platform for privacy-preserving analytics.

To read more about how LeapYear protects data providers and data buyers, click here.
