Friction-free Data Management in Private Markets

Purchase a property, rent it out, and earn income from the rent. Simple, right?

What vehicle is used to acquire the asset? Already, the story includes investors (as partners in a partnership fund), co-investors (in the case of a Special Purpose Vehicle), and regulators (depending on the domicile).

Is this vehicle self-administered, or is a Third-Party Administrator (TPA) involved – likely already part of your “circle of trust”? A custody function is probably in place, potentially including sub-custodians across multiple jurisdictions.

Within the General Partner (GP), teams have distinct priorities. Fund administration focuses on servicing needs, such as capital flows, performance reporting, and valuations. Investor relations manages investor expectations and communications. Investment officers focus on performance monitoring data, forecasts, and comparative data within the sector. Meanwhile, tax, compliance and ESG departments bring their own requirements. And don’t forget the auditors, external lenders, depositary banks, and fiduciary bodies that might also be involved.

In summary, this is a story about multiple interested parties – each contributing to, or needing access to certain parts of a single super-set of data: data related to a single asset, the investment vehicle owning that asset, and the structures which surround it (beneficial owners, nominees, carry partners, co-investors).

Today this “single super-set of data” – capital cash flows, income and expenditure, critical documents, investment monitoring KPIs, metadata – is distributed across the asset servicing supply chain. The GP may currently hold much of the data on their infrastructure, the administrator another portion, and the custodian a further part. Regardless, the investor receives what they request – if influential – or otherwise what is generally available.

Far from frictionless, the notion of a “single super-set of data” is undermined by fragmented data sets that remain inaccessible to accredited participants. The situation is further complicated by data duplication – copies whose accuracy is often assumed rather than verified – and unverified duplicates can quickly become outdated or inconsistent. This raises a critical question: who maintains the master data – “the source of truth”? Is it the General Partner, the administrator, or the custodian? And once the underlying data changes, how is this communicated to all relevant parties?
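To make the duplication problem concrete, here is a minimal sketch of how divergent copies of the same record can be detected. Everything in it is hypothetical and invented for illustration – the party names, the record fields, and the assumption that the GP holds the master copy – but the technique (content fingerprinting) is a standard way to verify whether “copies” still agree:

```python
# Hypothetical sketch: detecting when copies of the same record
# held by different parties have silently diverged.
import hashlib
import json

def fingerprint(record):
    """Stable content hash of a record (key-order independent)."""
    canonical = json.dumps(record, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

# Invented example data: three parties each hold a "copy".
copies = {
    "gp":            {"asset": "asset-1", "nav": 102.5},
    "administrator": {"asset": "asset-1", "nav": 102.5},
    "custodian":     {"asset": "asset-1", "nav": 101.9},  # stale copy
}

# Assume, for this sketch, that the GP holds the master record.
master = fingerprint(copies["gp"])
divergent = [party for party, rec in copies.items()
             if fingerprint(rec) != master]
print(divergent)  # -> ['custodian']
```

The sketch only answers “do the copies still match?”; the harder organisational question – who is entitled to declare their copy the master – is exactly the governance gap the article describes.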

At a small scale, teams can compensate for fragmented data. But, as the enterprise scales, operational challenges soon exceed what can be managed through overtime or individual effort. At this point, system and data architecture limitations become the key constraint on business growth.

Options include hiring more staff or outsourcing problematic functions to lower-cost locations. Yet outsourcing often prioritises cost over value – cutting direct expenses without solving the underlying issue. This approach may offer temporary relief, but as the business grows, unresolved operational inefficiencies will resurface. And while low-cost failures may be cheaper, they are no less damaging. Without addressing the core operational challenges, the associated risks remain firmly in place.

Established players grow through robust data models and the appropriate tools. They understand the strategic advantages these bring – removing barriers to growth and reducing operational risk. While challenges remain – such as achieving interoperability with other participants and integrating multiple generations of “best-of-breed” applications – the core model exists. There is a vision of convergence towards a data model that supports the data needs of all stakeholders.

“Enterprise Data Management” is a commonly used but often misunderstood phrase. In private markets, a federated data management model is a more achievable goal. The strategy starts by strengthening data capabilities in self-sufficient functional areas and supply-chain providers. These “islands” can then be connected through the right tools and processes, creating a cohesive and scalable data ecosystem.
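The federated pattern described above can be sketched in a few lines of code. This is an illustration only, under loose assumptions: each island keeps its own store, and a thin federation layer merges their views while keeping the originating source visible. All names here (Island, Federation, the record fields) are hypothetical, not drawn from any particular platform:

```python
# Hypothetical sketch of a federated data layer: each "island"
# (e.g. fund admin, custodian) keeps its own records, and a thin
# federation layer presents them as one queryable super-set.

class Island:
    """One self-sufficient functional area with its own data store."""
    def __init__(self, name):
        self.name = name
        self._records = {}  # record_id -> dict of fields

    def put(self, record_id, **fields):
        self._records[record_id] = fields

    def query(self, record_id):
        return self._records.get(record_id)

class Federation:
    """Connects islands and merges their views of the same asset."""
    def __init__(self, islands):
        self.islands = islands

    def unified_view(self, record_id):
        view = {}
        for island in self.islands:
            partial = island.query(record_id)
            if partial:
                # Tag each field with its originating island, so the
                # source of each piece of data remains visible.
                view.update({k: (v, island.name)
                             for k, v in partial.items()})
        return view

# Invented example data for two islands holding parts of one asset.
fund_admin = Island("fund_admin")
custodian = Island("custodian")
fund_admin.put("asset-1", nav=102.5, currency="EUR")
custodian.put("asset-1", held_units=1000)

fed = Federation([fund_admin, custodian])
print(fed.unified_view("asset-1"))
```

The point of the sketch is the shape, not the mechanics: no island surrenders ownership of its data, yet the end-user sees a single merged record – which is precisely the “unified repository” experience described next.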

Eventually, all participants will contribute to and draw from what appears – at least to the end-user – as a unified data repository for the investment lifecycle.

Whether this repository will be hosted by a GP, a TPA, or a new-generation service provider (e.g. a super-SaaS platform or ManCo) remains to be seen. Its emergence is inevitable, and it will define the larger, more resilient players in private markets.

A recent Celent™ report – ‘Overcoming Fractured Data Chains and Achieving Operational Brilliance in Private Markets’ – explores these challenges and the need for operational and data maturity. Celent is a research and advisory firm focussed on technology for financial institutions. We see the ServiceNow™ model as a key enabler in achieving this objective.

 

ServiceNow™ Data Fabric and Customer Service Management (CSM), supported by native workflows, provide the technology to address many of these operational challenges.

ServiceNow is a registered trademark of ServiceNow Inc.

Celent is a registered trademark of Celent, Inc. and/or its affiliates.

By Bhavin Shah, Head of ServiceNow Practice.