Wake up to the real costs of data management

Dan Groman, chief technology officer, Enfusion

Chief operating officers, heads of operations, and other operational leaders can no longer afford to treat data management as the domain of technologists. This imperative applies whether you are leading a small hedge fund or a large institutional asset management firm. Now is the time to become data literate, even if database experts remain responsible for the technical details of data management.

How does this affect the way you operate? Data drives your costs and determines your ability to grow. Especially in the past few years, data has proliferated from front to back office. Much of the investment decision-making process consists of utilising data from many sources: market data, data about your assets and portfolios, derived and third-party data, and the data that resides in your books and records.

If you find yourself thinking “yes, of course” at this data-oriented view, you should also ask whether you have fully integrated data management thinking into your operations.

Workarounds work until they don’t. In Enfusion’s experience, most investment managers are only beginning to tackle profound operational change. The impacts are significant. Beyond the usual cost-of-ownership metrics, the real costs of ineffective data management show up in the returns you can deliver to investors. Needless data tangles create operational drag that contributes to trade slippage and degrades a fund’s performance.

Leaders in an investment management firm must get the firm to align around three critical business goals for investment management data:

  • Deliver on the promise of all the data at your fingertips by optimising data design
  • Dislodge hidden alpha that is obscured by noisy data and fuzzy insights
  • Drive out unnecessary operational complexity that hampers workflows

Achieving these goals means adhering to three core data management principles.

PRINCIPLE 1

Data design is everything

There’s one major reason why you should care about data design. Every time data moves from one format or system to another, the risks to accuracy and timeliness increase. For example, suppose your IBOR (investment book of record) and ABOR (accounting book of record) reside in two separate systems.

Start-of-day and end-of-day position data can easily drift out of sync because different groups of people with different ways of working are maintaining multiple data systems. Or, consider juggling data across a wide range of asset classes and strategies, all supported by separate systems. You see these disconnects all the time when looking at your operational reality.
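
To make the drift concrete, below is a minimal sketch of a reconciliation check between the two books. The position feeds, instrument identifiers, and quantities are hypothetical; a real check would pull from each book of record’s actual data and apply tolerances appropriate to the asset class.

```python
# Minimal sketch: detect position drift between an IBOR and an ABOR.
# The feeds and instruments below are hypothetical stand-ins for the
# two systems' actual position snapshots.

IBOR_POSITIONS = {"AAPL": 10_000, "MSFT": 5_500, "7203.T": 120_000}
ABOR_POSITIONS = {"AAPL": 10_000, "MSFT": 5_200, "7203.T": 120_000}

def find_breaks(ibor: dict, abor: dict) -> list:
    """Return instruments whose quantities disagree across the two books."""
    breaks = []
    for instrument in sorted(set(ibor) | set(abor)):
        ibor_qty = ibor.get(instrument, 0)
        abor_qty = abor.get(instrument, 0)
        if ibor_qty != abor_qty:
            breaks.append((instrument, ibor_qty, abor_qty))
    return breaks

for instrument, ibor_qty, abor_qty in find_breaks(IBOR_POSITIONS, ABOR_POSITIONS):
    print(f"BREAK {instrument}: IBOR={ibor_qty} vs ABOR={abor_qty}")
```

Every break a check like this surfaces is work for someone, and every tolerance it hides is a seam waiting to tear.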

In an ideal world, there would be no seams or hand-offs between portfolio planning and decision-making; placing, completing, and booking orders; compliance; administrator reconciliations; and back-office accounting.

Coherent data design makes that ideal a reality. From portfolio construction to general ledgers and NAV calculations, every aspect of your operations, from front office to back office, can pull from a single source of data. That single source provides one accurate view of the truth. Portfolio managers, traders, middle-office personnel, and finance operations teams aren’t working from different facts. Your data becomes an operational “gold standard.”
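
As a minimal sketch of what “single source” means in practice, assuming a hypothetical shared PositionStore rather than any particular product: every desk books each fill once, and every view reads from the same record.

```python
# Minimal sketch of a single-source design: one position store, many views.
# PositionStore, pms_view, and accounting_view are illustrative names only.

from collections import defaultdict

class PositionStore:
    """One shared book of record that every function reads and writes."""

    def __init__(self):
        self._positions = defaultdict(float)

    def book_fill(self, instrument: str, quantity: float) -> None:
        # Each fill is booked exactly once, in one place.
        self._positions[instrument] += quantity

    def snapshot(self) -> dict:
        return dict(self._positions)

store = PositionStore()
store.book_fill("AAPL", 10_000)
store.book_fill("AAPL", -2_500)

# Front office and back office read the same facts; there is no seam to reconcile.
pms_view = store.snapshot()         # portfolio manager's intraday view
accounting_view = store.snapshot()  # general-ledger and NAV view
assert pms_view == accounting_view
```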

Without the right data design, disconnects need to be sewn together with either technology or people solutions. But every seam is a risk. When under operational stress, seams can easily tear.

PRINCIPLE 2

The noisier your data, the fuzzier your insights

In a hyper-competitive marketplace, the bar for alpha continues to rise, not just investment alpha but operational alpha as well. Creating alpha requires a constant stream of lightning-speed decisions predicated on a wealth of detail that resides in data.

Here’s why you should care about noise in your data: it is all too easy to lose track of the game plan if noise drowns out meaningful signals. That noise is particularly likely when you lack an overarching data design and are constantly translating between disparate systems, losing clarity at the interface points. You then end up with bad inputs feeding your calculations, leading to fuzzy and ill-formed conclusions. It can be quite challenging to figure out whether issues come from bad inputs, flawed assumptions, or calculation errors.

The same situation emerges with broader operational data and cost of ownership. Multiple data sources, convoluted processes for updating and maintaining multiple systems, unnecessary workarounds, and data quality issues all serve to magnify noise. In other words, there is a risk of losing the nuances where opportunities to improve operational alpha are hiding.
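
A toy illustration of bad inputs at a seam: suppose a hypothetical feed quotes a UK stock in pence while the downstream system assumes pounds. The tickers, quantities, and prices below are invented; the point is how one unnoticed unit mismatch distorts every weight in the book.

```python
# Minimal sketch: one unit mismatch at a system seam drowns out the signal.
# Positions and prices are illustrative; VOD.L arrives in pence (GBX),
# but the downstream weight calculation assumes pounds.

positions = {"VOD.L": 100_000, "AAPL": 1_000}
feed_prices = {"VOD.L": 72.5, "AAPL": 185.0}  # VOD.L quoted in pence

def portfolio_weights(positions: dict, prices: dict) -> dict:
    values = {sym: qty * prices[sym] for sym, qty in positions.items()}
    total = sum(values.values())
    return {sym: value / total for sym, value in values.items()}

# A naive consumer treats every price as a major unit: VOD.L looks ~100x too big.
noisy = portfolio_weights(positions, feed_prices)

# Normalising at the seam restores the signal.
clean_prices = dict(feed_prices)
clean_prices["VOD.L"] /= 100  # GBX -> GBP
clean = portfolio_weights(positions, clean_prices)

print({s: round(w, 3) for s, w in noisy.items()})  # distorted weights
print({s: round(w, 3) for s, w in clean.items()})  # true weights
```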

FIVE SOURCES OF HIDDEN OPERATIONAL ALPHA

  • Trading against accurate intraday cash versus end-of-day updates
  • Better, quicker optionality in response to voluntary corporate actions
  • Optimising allocation decisions by evaluating the performance of your third parties
  • Eliminating delays and workarounds across multiple security masters
  • Real-time views of your positions and weights, with instant synchronisation between your Portfolio Management System (PMS) and Order and Execution Management System (OEMS)

The solution? Take an approach to data management that allows everyone to use the same data in real time, across multiple funds and a rich range of asset classes, and with built-in controls to minimise human error and data degradation across numerous systems, each with its own logic and data models.
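
What a built-in control might look like, as a minimal sketch: validate every update before it can touch the shared dataset. The checks, field names, and rejection handling below are illustrative assumptions, not a description of any particular platform.

```python
# Minimal sketch of a built-in control: reject bad updates at the gate
# so errors never propagate into the shared book. Fields are illustrative.

def validate_trade(trade: dict) -> list:
    """Return control violations; an empty list means the trade passes."""
    errors = []
    if trade.get("quantity", 0) == 0:
        errors.append("quantity must be non-zero")
    if trade.get("price", 0) <= 0:
        errors.append("price must be positive")
    if trade.get("side") not in {"BUY", "SELL"}:
        errors.append("side must be BUY or SELL")
    return errors

trade = {"symbol": "MSFT", "side": "BUY", "quantity": 500, "price": -410.0}
violations = validate_trade(trade)
if violations:
    print("rejected:", "; ".join(violations))  # never reaches the shared data
else:
    print("booked")
```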

PRINCIPLE 3

Data management is ultimately about workflow

If you drill down into most issues with inefficient or risk-prone workflows, data management tends to be a significant root cause. This is where data management and operational leadership meet. You can often detect data management issues simply by looking at your investment management workflows; you don’t have to be a data architect to find the problems. In turn, by addressing those data management issues, you can improve the workflows that data supports.

The start of each trading day for a global portfolio provides a case in point. Even once you determine when your trading day begins, it can be challenging to roll trading from one market to the next while maintaining accurate records. As the trading day for each market picks up speed, the entire team benefits from access to high-quality data to support investment opportunities. If data resides on disparate systems, each one needs to reflect positions and cash in real time so that traders can start trading as markets around the world open.
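
As a minimal sketch of the rollover problem, the snippet below works out which markets are in session at a given instant, so positions and cash can be staged before each open. The exchange hours are simplified (no holidays, half days, or lunch breaks) and the market list is illustrative.

```python
# Minimal sketch: roll the trading day across markets by time zone.
# Session hours are simplified; a real trading calendar would handle
# holidays, half days, and lunch breaks.

from datetime import datetime, time
from zoneinfo import ZoneInfo

MARKETS = {
    "Tokyo":    ("Asia/Tokyo",       time(9, 0),  time(15, 0)),
    "London":   ("Europe/London",    time(8, 0),  time(16, 30)),
    "New York": ("America/New_York", time(9, 30), time(16, 0)),
}

def open_markets(now_utc: datetime) -> list:
    """Return the markets currently in their regular session."""
    open_now = []
    for name, (tz, session_open, session_close) in MARKETS.items():
        local = now_utc.astimezone(ZoneInfo(tz)).time()
        if session_open <= local < session_close:
            open_now.append(name)
    return open_now

now = datetime.now(ZoneInfo("UTC"))
print("open now:", open_markets(now) or "none")
```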

Conversely, with one shared global system and single source of data, you have confidence in knowing where things stand and what your investment team has available to trade. Accurate, real-time analytics can then help examine and optimise activity.

When everyone is rowing in the same direction, with the front, middle, and back office all viewing the same data, workflow becomes more coherent, faster, and less prone to risk. By contrast, those unlucky enough to have disconnected systems are rowing in different boats entirely. As the day wears on, conditions go from bad to worse, and the boats drift further off course with each passing hour.

Conclusion

Design, consistency, and efficiency provide the ultimate measures of the real cost of ownership of data sources, platforms, and service providers. That’s why operational leaders need to keep data management at the top of their agenda.

The wrong decisions about data have significant negative consequences. Technology costs go up while operating margins shrink because suboptimal design impedes workflow. Moreover, poor design impinges on fund performance and the value you can deliver to investors.

But with the right decisions about data, all the workarounds that you’ve learned to live with become unnecessary. You have a platform for investment management operations that, by design, circumvents the need to stitch together disconnected systems. And with better design comes better workflow, improving your operational alpha and, ultimately, your fund performance.

Dan Groman is chief technology officer at Enfusion. He is responsible for the technology strategy, development, and day-to-day operations of Enfusion’s global client technology solutions. He is also responsible for the company’s information technology and security as well as enterprise application support, leading teams distributed across the company’s global campus of eight offices. Groman joined Enfusion in 2016 and led the proof of concept for the firm’s visual analytics platform. He was appointed to his current role in March 2020. Prior to joining Enfusion, he served as a systems engineer at CME Group. He began his career in asset management and hedge funds in debt and middle/back-office operations, holding development positions at Northern Trust Hedge Fund Services and King Street Capital Management. Groman holds a bachelor’s degree in Economics from the University of Maryland and a master’s degree in Computer Science from DePaul University.