The Flow Report - trust but verify with data supply

Many organizations evaluate data vendors based on information provided directly by those vendors - either in marketing materials, demos, or third-party marketplaces. As costs rise and business dependencies increase, however, that approach is not sufficient to create a durable data supply chain.

There are several challenges in relying on 1st party (vendor-provided) information and popularity to assess a data supplier:

Lack of transparency from vendors.

It is difficult to collect negative data points on vendors, for the simple reason that vendors (and even their clients) typically feel incentivized to focus on strengths. Whether a weakness is product-oriented (e.g., limited historical data, quality issues, etc.) or regulatory in nature, a buyer often cannot uncover or detect the issue without trial and error or, in some cases, informal networking with peers. Serious problems can be extremely difficult to anticipate until they manifest - often after users have relied on the product or integrated it into their systems.

Crowd bias for popular products.

There's a certain logic to staying with the pack in vetting data vendors. Unfortunately, if something goes wrong - a failure to deliver, the loss of an underlying source, or even litigation - the wisdom of the crowd can prove to be cold comfort. That wisdom is only valuable if we can also assume that each organization has conducted reasonable diligence, yet for many vendor characteristics, there is not a shared understanding of what "reasonable" means.

Information becomes stale.

Businesses and their products change frequently, but organizations conduct diligence on vendors at one point in time. This means that organizations need a regular flow of new information that is updated at least as frequently as their vendors change. While this problem may seem obvious, it is difficult to solve. Regulatory requirements on vendor monitoring promulgated by the Securities and Exchange Commission and the Federal Trade Commission can mean fines (or worse) on top of this business dilemma.

How can one overcome these challenges and build a durable data supply chain?

Get a complete profile of each data source.

Organizations should track three categories of information for each data source they rely on: 1st party (vendor-provided), 2nd party (client-provided), and 3rd party (public/factual) information. This way, one gets a higher-resolution image of the real product. For example, if a vendor has a “Trust and Assurance” page or advertises a GDPR certification, check that claim against the other categories: determine whether the vendor has (inadvertently) delivered personal data to another client or been the defendant in a class action lawsuit.

Adhere to a written policy on data sourcing.

Regardless of how popular a data product seems, organizations should follow a consistent procedure or minimum standard in diligence and assessments of their data suppliers. This can protect one’s organization from crowd bias. Consider that popular vendors are (at least historically) more likely to attract regulatory enforcement or create widespread problems leading to negative press or even government intervention. Written policies on data sourcing are the most straightforward way to establish agreed-upon procedures, avoid internal clashes, and demonstrate appropriate controls to third parties (e.g., regulators, investors, clients, etc.).

Follow a regular cadence of assessing data sources.

Most organizations should aim to reassess their data supply chain more than once per year. A quarterly review is a sustainable frequency for most; for larger organizations with substantial data portfolios, quarterly reviews will require less time and effort if the organization also establishes a more frequent monitoring function, such as the service provided by Glacier.

Glacier helps organizations solve these problems by determining what to trust and verifying other information independently. Glacier uses a mix of vendor-sourced, public, and crowd-sourced information - reviewed by industry veterans - to evaluate vendors and periodic threats to their businesses. Contact Glacier to learn more.


©2024 Glacier Network LLC d/b/a Glacier Risk (“Glacier”). This post has been prepared by Glacier for informational purposes and is not legal, tax, or investment advice. This post is not intended to create, and receipt of it does not constitute, a lawyer-client relationship. This post was written by Don D’Amico without the use of generative AI tools. Don is the Founder & CEO of Glacier, a data risk company providing services to users of external and alternative data. Visit www.glaciernetwork.co to learn more.
