Within the General Partner there will be groups of people with a range of interests in the underlying asset: Fund Administration will have its servicing needs (capital flows, performance reporting, valuations…); Investor Relations will want to respond to the needs of the investors; Investment Officers will be looking at performance monitoring data, forecasts, and comparative data within the sector. Then the Tax, Compliance, and ESG departments will have their own interests…
We could add Auditors, other third parties (external lenders), perhaps a Depositary Bank, Fiduciary bodies …
This is a story about multiple interested parties, all contributing to the data pool or needing pieces or views of a single super-set of data: data related to a single asset, the investment vehicle owning that asset, and the structures which surround it (beneficial owners, nominees, carry partners, co-investors …)
Today that “single super-set of data” (capital cash flows, income and expenditure, critical documents, investment monitoring KPIs, metadata…) is distributed across an asset-servicing supply chain. The GP may hold much of that data on their own infrastructure, their Administrator some more, the Custodian a piece of it … The Investor gets what they ask for (if they are big enough) or what is shared (if they are not).
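As a rough illustration only, the sketch below shows how that super-set might be expressed as one shared record. The field names and groupings are assumptions made for the example, not a standard Private Markets data model.

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative sketch only: field names and groupings are assumptions,
# not a standard Private Markets data model.

@dataclass
class CashFlow:
    flow_date: date
    amount: float
    currency: str
    flow_type: str          # e.g. "capital_call", "distribution", "expense"

@dataclass
class MonitoringKPI:
    name: str               # e.g. "net_IRR", "TVPI", "occupancy_rate"
    as_of: date
    value: float

@dataclass
class RelatedParty:
    role: str               # e.g. "beneficial_owner", "nominee", "co_investor"
    name: str

@dataclass
class AssetRecord:
    """One view of the 'single super-set of data' for an asset
    and the investment vehicle that owns it."""
    asset_id: str
    vehicle_id: str
    cash_flows: list[CashFlow] = field(default_factory=list)
    kpis: list[MonitoringKPI] = field(default_factory=list)
    parties: list[RelatedParty] = field(default_factory=list)
    documents: list[str] = field(default_factory=list)   # references to critical documents
    metadata: dict[str, str] = field(default_factory=dict)
```

In practice the problem is not defining such a record; it is that its pieces live in different systems owned by different parties.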
Not Friction-free
The “single super-set of data” is fragmented, and the pieces are not readily accessible to accredited participants. Worse still, pieces are duplicated (or rather “presumed to be duplicated”, since the “copies” may not be authenticated copies, which introduces its own problems), and copies of data drift out of date or out of sync.
Who maintains the “master data”, the “source of truth”: the GP, the Administrator, the Custodian? And how does that source of truth notify the other parties when the underlying data changes?
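One common answer is to designate a single system of record per data domain and have it publish change events to every other accredited party. The minimal sketch below is illustrative only; the party names, event shape, and versioning scheme are assumptions, not a description of any particular platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable

# Minimal sketch of "the source of truth notifies other parties of changes".
# Party names and the event shape are illustrative assumptions.

@dataclass
class ChangeEvent:
    record_id: str
    field_name: str
    new_value: str
    version: int
    changed_at: datetime

class SourceOfTruth:
    """The designated master for one data domain (e.g. capital cash flows)."""

    def __init__(self) -> None:
        self._subscribers: list[Callable[[ChangeEvent], None]] = []
        self._version = 0

    def subscribe(self, callback: Callable[[ChangeEvent], None]) -> None:
        # GP, Administrator, Custodian, Investor portal, etc. register here.
        self._subscribers.append(callback)

    def update(self, record_id: str, field_name: str, new_value: str) -> None:
        self._version += 1
        event = ChangeEvent(record_id, field_name, new_value,
                            self._version, datetime.now(timezone.utc))
        for notify in self._subscribers:
            notify(event)   # every copy learns of the change, with a version number

# Usage: the Administrator's copy stays in step instead of silently drifting.
master = SourceOfTruth()
master.subscribe(lambda e: print(f"Administrator received v{e.version}: "
                                 f"{e.record_id}.{e.field_name} = {e.new_value}"))
master.update("fund-123/asset-7", "latest_valuation", "18,400,000 EUR")
```

The version number matters as much as the notification itself: it is what lets a receiving party prove its copy is the authenticated, current one.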
On a small scale, teams can compensate for the shortcomings of fragmented and distributed data. But once an enterprise tries to scale, the operational problems become too large for overtime and discretionary effort on the part of critical personnel. The data architecture and support systems become the defining constraint on business growth.
Hiring more people is one route. Outsourcing problematic functions to a lower-cost location is another. But typically, the decision to outsource is based on input costs rather than output value.
Don’t fix the problem, just reduce the direct cost.
Well, that may work for a while, but growth will still find you out, and without addressing the core operational issue the attendant operational risk does not go away.
Is a low-cost operational failure any more comforting?
Big players get bigger (and stay bigger), sustained by robust data models and appropriate tools. They recognise the advantages: they have removed an obstacle to growth and reduced operational risk. They still have work to do offering interoperability with other participants, and they may have local issues getting different generations of “best of breed” applications to talk to one another. But the core model is there. There is a vision of convergence towards a data model which supports the data needs of all stakeholders.
“Enterprise Data Management” is a well-used but less well understood phrase. In Private Markets, a Federated Data Management model is a more practical goal: build out from those functional areas and supply-chain providers which already have a degree of self-sufficiency, then link those islands together with appropriate tools and processes, as sketched below.
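As a rough, purely illustrative sketch of that federated idea (the class and domain names below are hypothetical), each self-sufficient island keeps ownership of its own data, while a thin federation layer knows which island is authoritative for which domain and routes each request accordingly.

```python
from typing import Protocol

# Hypothetical sketch of a federated lookup layer: each "island"
# (Administrator, Custodian, GP systems...) remains the owner of its own
# domains, and a thin router directs each request to the right owner.

class DataIsland(Protocol):
    def get(self, record_id: str) -> dict: ...

class AdministratorSystem:
    def get(self, record_id: str) -> dict:
        return {"record_id": record_id, "domain": "capital_flows", "source": "Administrator"}

class CustodianSystem:
    def get(self, record_id: str) -> dict:
        return {"record_id": record_id, "domain": "holdings", "source": "Custodian"}

class FederationLayer:
    """Maps each data domain to the island that is authoritative for it."""

    def __init__(self) -> None:
        self._owners: dict[str, DataIsland] = {}

    def register(self, domain: str, island: DataIsland) -> None:
        self._owners[domain] = island

    def fetch(self, domain: str, record_id: str) -> dict:
        return self._owners[domain].get(record_id)

# Usage: stakeholders query one layer; the data stays where it is owned.
federation = FederationLayer()
federation.register("capital_flows", AdministratorSystem())
federation.register("holdings", CustodianSystem())
print(federation.fetch("capital_flows", "fund-123/drawdown-9"))
```

The mechanics matter less than the principle: data stays with the party best placed to maintain it, while every accredited stakeholder works from the same authoritative view.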