Data has always been crucial in private equity (PE), but the cost of underutilizing data is becoming harder to ignore. As artificial intelligence (AI) tools become more accessible and capable, firms that are investing in sophisticated data infrastructure are beginning to separate themselves from those still running on instinct and manual processes – differences that are increasingly reflected in performance. The need to modernize data capabilities is real, but the process is not straightforward. Many funds and portfolio companies operate with lean teams, and data quality initiatives often compete with more urgent priorities. As firms work through these challenges, a more practical understanding is emerging of what approaches deliver value — and where efforts tend to fall short.
The data maturity gap
Since many firms are the first source of institutional capital into a business, they tend to encounter companies running on spreadsheets and basic enterprise resource planning (ERP) systems, with limited reporting infrastructure. Attempts to layer sophisticated analytics on top of those foundations can quickly run into a fundamental problem: poor data.
This is not a theoretical concern. Poor data quality at a portfolio company carries real consequences. When the operating team is not receiving timely, reliable information, issues can go unnoticed and compound over time. What appears to be a well-performing operation can deteriorate significantly because the underlying data failed to surface the warning signs early enough.
One tempting shortcut gaining traction is using AI to clean up messy data rather than fixing it at the source. The problem is that AI amplifies what is already in the data: if the underlying data is inconsistent or incomplete, advanced tooling on top of it tends to produce unreliable output.
Deal origination: Where AI and data are already delivering
Buyout firms routinely evaluate thousands of opportunities each year. Early-stage screening, which includes reviewing confidential information memorandums (CIMs), benchmarking against prior deals and assessing industry dynamics, has traditionally consumed significant analyst time. AI tools have begun to compress that considerably. What previously required several hours of work per company can now be completed in a fraction of the time, with outputs that are often standardized and easier to compare.
Broadening access to investment insights
Another emerging application of AI is expanding access to analytical tools beyond the deal team to others involved in the investment committee process. When a broader group of decision-makers can independently review diligence materials and stress test assumptions before a company is formally presented, the quality of investment committee (IC) discussion often improves materially.
Unlocking deeper insights through data quality
Data analysis during diligence is most impactful when the underlying company data can support it. While revenue concentration is usually visible early in the process, profit concentration is often more difficult to assess without reliable, well-structured data. As a result, important drivers of value — and risk — can remain obscured until later in the investment lifecycle, when they are more difficult to address.
Value creation: Where the real returns are
When it comes to portfolio value creation, the highest returns often come from targeted, well-scoped applications of AI rather than broad AI deployments, at least in the early stages.
One common pattern is a company that tracks data diligently but lacks the infrastructure to produce usable outputs. The underlying information exists, but the extraction and presentation do not. In these situations, even focused interventions, such as designing automated data pipelines and clean dashboards, can surface insights that management has never had meaningful access to before. Margin leakage caused by operational bottlenecks, overtime costs tied to certification gaps, pricing patterns that suggest systematic underpricing – these are the kinds of findings that translate directly and quickly into earnings before interest, taxes, depreciation and amortization (EBITDA) improvement.
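To make the pattern concrete, here is a minimal sketch of the kind of automated extract described above: aggregating raw transaction records by segment and flagging where margin is leaking. The column names, sample figures and the 20% margin threshold are all illustrative assumptions, not a prescribed implementation.

```python
# Hypothetical sketch: an automated extract that surfaces margin
# leakage by segment from raw transaction records.
# Column names and the 20% threshold are illustrative assumptions.
import pandas as pd

def flag_margin_leakage(df: pd.DataFrame, threshold: float = 0.20) -> pd.DataFrame:
    """Aggregate revenue and cost by segment and flag low-margin segments."""
    summary = df.groupby("segment", as_index=False)[["revenue", "cost"]].sum()
    summary["margin"] = (summary["revenue"] - summary["cost"]) / summary["revenue"]
    summary["flagged"] = summary["margin"] < threshold
    # Lowest-margin segments first, so problems surface at the top of the report
    return summary.sort_values("margin").reset_index(drop=True)

# Illustrative raw records such as a company might track in a spreadsheet or ERP export
raw = pd.DataFrame({
    "segment": ["install", "install", "service", "service"],
    "revenue": [100.0, 120.0, 80.0, 90.0],
    "cost":    [95.0, 110.0, 50.0, 55.0],
})
report = flag_margin_leakage(raw)
```

The point is not the tooling but the shape of the intervention: the data already existed; a small, focused pipeline makes the leakage visible to management.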
More ambitious applications are emerging as well. Operators without traditional technical backgrounds are now building functional technology products using AI-assisted development tools, without relying on dedicated software engineering resources. Compliance-heavy workflows that once required large teams and manual effort are being redesigned around automation, compressing turnaround times from hours to seconds. These are not incremental improvements; they represent substantive changes in how portfolio companies can be run.
The broader implication is that AI is enabling faster, more accessible solution development and deployment. What once required significant time and coordination can now often begin with smaller, faster iterations led by business users. While these approaches can accelerate progress, they are most effective when paired with a broader strategy for building scalable, reliable data infrastructure.
The exit premium: Building buyer confidence
Strong data infrastructure does not necessarily command a direct valuation premium at exit, but it matters in a more fundamental way: it gives buyers conviction.
Conviction in the data, and by extension, in the sustainability of the revenue and margin profile, is ultimately what makes a business easier to acquire. A company that can clearly demonstrate its key performance indicator (KPI) trends, attribute its operational improvements to specific initiatives and present buyers with a transparent view of what they are purchasing reduces the uncertainty that buyers price into their offers.
The other exit-related value driver is actual EBITDA improvement. Automation, AI-enabled processes that enhance customer experience and data-driven operational improvements, all have a direct impact on the business’s financial profile. When those improvements are well documented and clearly attributable, they form the backbone of a credible value-creation narrative at exit.
Culture and adoption: It starts at the top
Meaningful technology adoption should begin with leadership. When firm leadership is actively using these tools and not merely endorsing them in the abstract, it changes the nature of the conversation.
The most effective approach is to give people access to tools and create space for experimentation. When a non-technical colleague independently builds a working application and demonstrates it to the team, the reaction is rarely skepticism; more often, it is curiosity. That curiosity, once engaged, tends to translate quickly into practical applications across the organization.
Cross-portfolio knowledge sharing has also proven to be a meaningful lever. For example, quarterly convenings of technology leaders across portfolio companies and annual C-suite forums where companies share what is and isn’t working create constructive peer pressure.
One consistent finding on the portfolio side: reporting tools and dashboards fail when they are too complex. Designing for genuine simplicity, not just a simplified version of something inherently more complicated, creates a competitive advantage.
Where to focus now
The right investment in data and AI depends on where a given firm or portfolio company sits in its data maturity journey. That said, these approaches consistently deliver value:
Build a consistent reporting foundation: Standardized KPIs, automated data extracts and a reusable reporting model that feeds into clear, accessible dashboards give management the tools to be proactive rather than perpetually reactive.
Start with the outcome you want, then work backward to identify the next step with the highest return: The most valuable investments are not always the most visible ones. Sometimes the right move is a relatively basic data infrastructure project that enables more sophisticated capabilities downstream.
Assign clear owners: Organizations that put specific people and resources behind the effort (evaluating use cases, building tools, making smart build‑versus‑buy decisions and keeping governance aligned with growing needs) move faster and more effectively.
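A consistent reporting foundation ultimately comes down to shared metric definitions. The sketch below shows one way a reusable KPI model might look, so every company reports gross margin or EBITDA margin the same way. The KPI set, field names and sample figures are illustrative assumptions.

```python
# Hypothetical sketch: a shared KPI model so metric definitions are
# standardized across reporting entities rather than redefined per dashboard.
# The KPI set and field names are illustrative assumptions.

KPI_DEFINITIONS = {
    "gross_margin":    lambda m: (m["revenue"] - m["cogs"]) / m["revenue"],
    "ebitda_margin":   lambda m: m["ebitda"] / m["revenue"],
    "revenue_per_fte": lambda m: m["revenue"] / m["headcount"],
}

def compute_kpis(monthly: dict) -> dict:
    """Apply the shared definitions to one month of raw figures."""
    return {name: round(fn(monthly), 4) for name, fn in KPI_DEFINITIONS.items()}

# One month of illustrative raw figures for a single portfolio company
month = {"revenue": 1_000_000.0, "cogs": 600_000.0,
         "ebitda": 150_000.0, "headcount": 40}
kpis = compute_kpis(month)
```

Centralizing definitions this way is what makes cross-portfolio benchmarking and dashboard feeds comparable; each company supplies raw figures, and the model applies the same calculations everywhere.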
How we can help
At Baker Tilly, we help PE firms build the data and AI capabilities that drive real value, from foundational data cleanup to advanced analytics and automation. We work across the organization to establish unified reporting, streamline data pipelines, identify high-return AI use cases and build practical tools that improve decision‑making. Our approach meets organizations where they are to help them move faster, with clearer insights and stronger conviction.