The financial industry is full of data that can bring value to businesses in a variety of ways.
- Businesses often buy data to make smarter decisions, but only if the value they get from the data is greater than the price they paid.
- Companies also sell data, but they need to understand the value of data so they can put a price on it.
- Investors need to value companies, which means determining the value of all of a company’s assets: its buildings, its machinery, its workforce, and its data.
Each situation poses its own complexities, but the fundamental question remains the same: how do you value the data? MIT Sloan professor Maryam Farboodi and a team of researchers have created an approach to help answer this question.
“The last two decades have seen a huge increase in the use of data in different aspects of the economy, but we don’t know how to value this asset,” said Farboodi, a co-author of the recent article “Valuing financial data.”
“Right now, some of America’s largest companies are highly valued for the data they have, such as consumer data or production data.” This contributed to “a huge scatter” between their book value and market value, Farboodi said, explaining that “accounting rules do not allow book value to include data unless that data has been purchased”.
Data is valuable to businesses and investors because it decreases uncertainty, giving them more information on which to base decisions. Not knowing how to value data has big implications for the economy, because it leads to mismeasurement of both individual companies’ value and overall U.S. GDP, she said.
To assign a monetary value to financial data, Farboodi and her co-authors relied on an existing framework called the rational expectations equilibrium model.
Using Institutional Brokers’ Estimate System analysts’ annual earnings forecasts for 5,506 companies covering the years 1985 to 2015, they explored how investors’ valuations of standard data vary depending on investor characteristics. Investors can buy this data and use it to form their own forecasts, or beliefs, about what returns will be.
The authors used a sufficient statistics approach to estimate the model. They constructed the expected return and volatility of different portfolios using regression analysis and the data series they wanted to assess, then used those sufficient statistics to measure the monetary value of the data set to investors.
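The paper’s actual estimation is far richer, but the core intuition — that data is worth money because it shrinks the uncertainty an investor faces — can be sketched in a few lines. The sketch below is a hypothetical simplification, not the authors’ code: it uses standard Bayesian updating under a mean-variance utility, holds the investor’s forecast fixed, and values a data signal as the gain in certainty-equivalent profit from the reduced residual variance. All function names and numbers are illustrative.

```python
def posterior_variance(prior_var: float, signal_var: float) -> float:
    """Bayesian update for a normal signal: precisions (1/variance) add."""
    return 1.0 / (1.0 / prior_var + 1.0 / signal_var)

def data_value(mu: float, price: float, prior_var: float,
               signal_var: float, risk_aversion: float) -> float:
    """Stylized monetary value of a data signal to a mean-variance investor.

    With optimal position q = (mu - price) / (risk_aversion * var), the
    certainty equivalent of trading is (mu - price)^2 / (2 * risk_aversion * var).
    The signal's value is the certainty-equivalent gain from lower variance.
    (Simplification: the forecast mu is held fixed; only uncertainty changes.)
    """
    post_var = posterior_variance(prior_var, signal_var)
    ce_without = (mu - price) ** 2 / (2.0 * risk_aversion * prior_var)
    ce_with = (mu - price) ** 2 / (2.0 * risk_aversion * post_var)
    return ce_with - ce_without

# Illustrative numbers: a signal that halves payoff variance is worth paying for,
# and a more precise signal is worth strictly more.
coarse = data_value(mu=1.1, price=1.0, prior_var=0.04,
                    signal_var=0.02, risk_aversion=2.0)
precise = data_value(mu=1.1, price=1.0, prior_var=0.04,
                     signal_var=0.01, risk_aversion=2.0)
```

Under these assumptions the value of data is always positive and increasing in signal precision, which is the sense in which “all investors love data for its ability to reduce uncertainty.”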
The value of data falls when markets are illiquid
While all investors love data for its ability to reduce uncertainty, the authors found that data is valued less when markets are illiquid and trades have a large price impact: it becomes harder and more expensive to execute profitable trades, so the value of the financial data that informs those trades falls.
“Data helps a financial firm execute profitable trades that others may not know about, but if the markets are illiquid, that moves the price against the firm, and the firm cannot use the data very efficiently,” said Farboodi. “Market illiquidity diminishes the value of data for all investors, for all assets, for all investment styles, and for all levels of wealth.”
This decline is “orders of magnitude larger for wealthier investors with large portfolios,” Farboodi said. “Big investors need to make bigger trades, so when the price moves against them, they lose a lot,” she said. “Data loses its value to them much more.”
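The price-impact logic above can be illustrated with a textbook-style toy model (this is a hypothetical sketch, not the paper’s model; all numbers are made up). Suppose data gives a trader an edge of `edge` dollars per share, but trading `size` shares moves the price against the trader by a quadratic impact cost `impact * size**2`. Then the most the data can be worth is the profit from the optimal trade, `edge**2 / (4 * impact)`, which shrinks as illiquidity (`impact`) rises; and a large investor forced to trade big sizes loses far more to price impact.

```python
def trade_profit(edge: float, size: float, impact: float) -> float:
    """Expected profit: per-share edge from data, minus quadratic price-impact cost."""
    return edge * size - impact * size ** 2

def data_value_with_impact(edge: float, impact: float) -> float:
    """Max profit the data can generate: trade the optimal size edge / (2 * impact),
    which yields edge**2 / (4 * impact). Higher impact => lower data value."""
    optimal_size = edge / (2.0 * impact)
    return trade_profit(edge, optimal_size, impact)

# Doubling illiquidity halves the value of the data in this toy model.
liquid = data_value_with_impact(edge=0.5, impact=0.01)
illiquid = data_value_with_impact(edge=0.5, impact=0.02)

# A big investor who must trade a large block can see the edge wiped out entirely.
big_trade = trade_profit(edge=0.5, size=100.0, impact=0.01)
```

The last line captures Farboodi’s point about wealthy investors: at large trade sizes the quadratic impact cost dominates the linear edge, so the same data is worth much less to them when liquidity dries up.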
And if data is an asset whose value is highly sensitive to liquidity, “it can lead to self-fulfilling cycles and financial fragility,” Farboodi said.
If data makes up a large share of a company’s assets, “the company will become less valuable” when even a small negative shock makes markets illiquid. Investors who hold those companies in their portfolios then offload them because they are no longer valuable, which makes the companies less valuable still, and the cycle continues: markets become fragile.
A framework for measuring data
For policymakers, knowing how to properly value financial data is crucial for designing policies that regulate data, and for determining whether, and how much, consumers should be paid for their data.
“The idea of thinking about a framework to measure data, to put numbers on that, is to be able to think about the magnitude of those forces,” Farboodi said. “If you want to design a policy, you have to have the value of these forces, and that’s what’s so lacking in the data literature.”
Going forward, Farboodi plans to build on her data research by exploring the value of data produced by businesses, based on its importance to society and the economy versus its value to the business alone. Another project will involve data brokers who sell anonymized customer data to companies that use it for their own purposes.
The research was authored by Farboodi; Laura Veldkamp, professor of finance at Columbia Business School; Dhruv Singal, a PhD student at Columbia Business School; and Venky Venkateswaran, professor of economics at New York University’s Stern School of Business.