A multi-sourcing operating model sharing the same data
Today, that “single set of data” (capital cash flows, income and expenditure, critical documents, investment monitoring KPIs, metadata such as IRRs, money multiples and TVPI, budgets, forecasts …) is distributed across the participating parties. The GP may hold much of it, their Administrator some more, a Custodian or Depositary a piece of it … The Investor gets what they ask for (if they are big enough) or what is published (if they are not).
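To make one of those KPIs concrete: TVPI (Total Value to Paid-In) relates everything an investor has got back, plus what their stake is still worth, to what they put in. A minimal sketch (the function name and figures are illustrative, not from the source):

```python
def tvpi(distributions, residual_nav, paid_in):
    """Total Value to Paid-In: (cumulative distributions + current NAV) / paid-in capital."""
    return (sum(distributions) + residual_nav) / paid_in

# Hypothetical fund: 100 paid in, 65 distributed so far, remaining stake worth 60.
print(tvpi([40.0, 25.0], 60.0, 100.0))  # 1.25
```

The point of the article follows directly: if the GP, Administrator and Investor each compute TVPI from slightly different copies of the cash-flow data, they will report slightly different numbers for the “same” fund.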
The “single set of data” is fragmented, and the pieces are not accessible to all the participants. Worse, pieces are duplicated; or rather, presumed to be duplicated, because in practice the copies differ, and those differences introduce problems. Copies of data drift out of date and out of sync. Who maintains the reference data, the “source of truth”? The GP, the Administrator, the Custodian, the portfolio asset? And how does that source of truth notify the other parties when the underlying data changes?
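The “presumed to be duplicated” problem is exactly what a reconciliation exercise surfaces. A minimal sketch, with hypothetical record IDs and amounts, comparing two parties’ copies of the same cash-flow ledger:

```python
def reconcile(copy_a, copy_b):
    """Compare two copies of a ledger (dicts keyed by transaction ID).

    Returns the records that differ, as {id: (value_in_a, value_in_b)};
    None marks a record missing from one copy entirely.
    """
    diffs = {}
    for key in sorted(set(copy_a) | set(copy_b)):
        a, b = copy_a.get(key), copy_b.get(key)
        if a != b:
            diffs[key] = (a, b)
    return diffs

# Hypothetical copies held by the GP and their Administrator.
gp_copy    = {"CF-001": 1_000_000, "CF-002": 250_000}
admin_copy = {"CF-001": 1_000_000, "CF-002": 255_000, "CF-003": 50_000}

print(reconcile(gp_copy, admin_copy))
# CF-002: the amounts disagree; CF-003: missing from the GP's copy altogether.
```

At small scale this comparison is done by eye in spreadsheets; the article’s argument is that past a certain volume it cannot be.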
It is hard enough to get data flowing through one organisation to hit the “end points” used by multiple “consumers”. When external sources are involved, the issues grow: different data formats, manual interventions to exchange data in spreadsheets, loss of provenance or lineage of the data (so can it still be trusted?), and so on.
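Preserving lineage essentially means that every data point carries a record of where it came from and who has touched it. A minimal sketch of that idea, with hypothetical party and action names:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DataPoint:
    """A value plus its lineage: the chain of (party, action) hops it has been through."""
    value: float
    lineage: tuple = field(default_factory=tuple)

    def handoff(self, party, action):
        # Each hand-off appends to the chain rather than overwriting it,
        # so the full provenance survives the journey between parties.
        return DataPoint(self.value, self.lineage + ((party, action),))

nav = DataPoint(60.0, (("PortfolioCo", "reported"),))
nav = nav.handoff("Administrator", "validated").handoff("GP", "published")
print(nav.lineage)
# (('PortfolioCo', 'reported'), ('Administrator', 'validated'), ('GP', 'published'))
```

The moment a number is re-keyed into a spreadsheet and emailed on, that chain is broken, which is the “can it still be trusted?” question in the text.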
On a small scale, teams can compensate for the shortcomings of fragmented data. But once an operation tries to scale, the problems soon become too large for overtime and discretionary effort on the part of critical personnel. Systems become the defining constraint on business growth.
Hiring more people is one route. Outsourcing problematic functions to a lower-cost location is another. But typically, the decision to outsource is based on input costs rather than output value: don’t fix the problem, just reduce the impact. That may work for a while, but growth will still find you out, and unless the core operational issue is addressed, the operational risk does not go away.
Big players are sustained by robust data models supported by appropriate tools. They recognise the advantages: they have removed obstacles to growth and reduced operational risk. They still have work to do on interoperability with other participants, and they may have local issues getting different generations of “best of breed” applications to communicate. But the core model is there.
Smaller players are either in their sweet spot or constrained by their systems infrastructure.
There will be firms looking to break through to the next tier: pushing their fundraising, motivating their deal teams, and building a data management problem for themselves down the track.
Similar challenges arise for those firms extending their product lines by merging with other sub-asset class specialists. There will be few efficiency gains if the data management needs of the merged entity are not addressed.
There will come a time when all participants feed into, and consume from, a common, single data repository servicing the investment process. Whether it is hosted by a GP, a TPA or a new breed of service provider (a super-SaaS platform) remains to be seen.