November 11, 2025
DXC pioneers AI Mesh
by Josef A. Habdank, AI & Data Engineering Global Practice Lead, DXC Technology
Data-driven enterprise connectivity isn't just about linking systems. It's about using those connections to extract value from vast, distributed data landscapes without adding unnecessary complexity.
Traditional approaches, such as centralized data warehouses or monolithic ETL pipelines, often fall short, creating silos, inefficiency and excess effort for marginal gains.
Two complementary architectures, Data Mesh and AI Mesh, together form a robust framework for enterprise data management.
Data Mesh handles high-volume, high-frequency data flows through decentralized, product-oriented structures. AI Mesh overlays a network of intelligent agents to serve the low-volume, ad hoc needs that structured pipelines handle poorly. The hybrid model gives distributed systems the best of both worlds: centralized governance with decentralized execution.
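To make that division of labor concrete, here is a minimal sketch of how a hybrid router might decide between the two paths. The thresholds and class names are hypothetical illustrations, not part of any DXC product.

```python
from dataclasses import dataclass

# Hypothetical cut-offs for illustration only; real thresholds would be
# tuned per enterprise against pipeline cost and agent token pricing.
PIPELINE_MIN_BYTES_PER_DAY = 1_000_000_000   # ~1 GB/day
PIPELINE_MIN_QUERIES_PER_DAY = 100

@dataclass
class DataSource:
    name: str
    bytes_per_day: int
    queries_per_day: int

def route(source: DataSource) -> str:
    """Send heavy, frequent flows to Data Mesh pipelines;
    the long tail goes to AI Mesh agents."""
    if (source.bytes_per_day >= PIPELINE_MIN_BYTES_PER_DAY
            or source.queries_per_day >= PIPELINE_MIN_QUERIES_PER_DAY):
        return "data-mesh-pipeline"
    return "ai-mesh-agent"

print(route(DataSource("sales_transactions", 50_000_000_000, 2_000)))  # data-mesh-pipeline
print(route(DataSource("regulatory_reports", 40_000, 3)))              # ai-mesh-agent
```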
This blog draws on the established principles and emerging trends of 2025 AI-augmented data ecosystems. We'll examine how the two architectures complement each other, the real-world challenges they address, and why this combination could redefine enterprise connectivity.
Data Mesh is a decentralized data architecture that shifts from centralized data teams to domain-oriented ownership. Pioneered by Zhamak Dehghani and popularized through works like her 2020 article published on Martin Fowler's site, Data Mesh treats data as a product managed by cross-functional domain teams.
Data flows from sources (producers) through aligned data products (source-aligned, aggregated and consumer-aligned) to consumers, like apps or dashboards. This mesh-like structure connects silos, enabling scalable analytics and AI applications. Companies like AWS, IBM and Snowflake have adopted variations, emphasizing how Data Mesh solves security challenges through distributed ownership.
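As a rough mental model of that producer-to-consumer flow, the sketch below chains three kinds of data products together. The names, fields and lineage helper are illustrative assumptions, not a standard Data Mesh API.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """A domain-owned data product; the owning team, not a central
    data team, is accountable for its schema, quality and availability."""
    name: str
    owner_domain: str
    kind: str                 # "source-aligned" | "aggregated" | "consumer-aligned"
    upstream: list = field(default_factory=list)

# Source-aligned products sit closest to the producing systems...
orders_raw = DataProduct("orders_raw", "sales", "source-aligned")
payments_raw = DataProduct("payments_raw", "finance", "source-aligned")

# ...aggregated products combine them across domains...
revenue = DataProduct("revenue_by_region", "finance", "aggregated",
                      upstream=[orders_raw, payments_raw])

# ...and consumer-aligned products feed apps and dashboards.
dashboard_feed = DataProduct("exec_dashboard_feed", "analytics",
                             "consumer-aligned", upstream=[revenue])

def lineage(product: DataProduct, depth: int = 0) -> None:
    """Walk upstream to show where a consumer-facing product comes from."""
    print("  " * depth + f"{product.kind}: {product.name} ({product.owner_domain})")
    for parent in product.upstream:
        lineage(parent, depth + 1)

lineage(dashboard_feed)
```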
In practice, Data Mesh excels for high-volume, high-frequency tables, e.g., daily gigabytes or terabytes of transaction data in e-commerce. Tools like Databricks and Confluent support this by facilitating streaming and domain-specific pipelines, often AI-generated for enhanced efficiency.
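A typical high-volume flow of this kind might look like the following PySpark Structured Streaming sketch, landing an e-commerce order topic from Kafka in a domain-owned Delta table. The broker address, topic name and paths are placeholders, assumed for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Read the high-frequency transaction stream from Kafka
# (broker and topic are placeholder names).
orders = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "ecommerce.orders")
          .load()
          .select(col("key").cast("string"),
                  col("value").cast("string"),
                  col("timestamp")))

# Land it in a Delta table owned by the sales domain, checkpointed
# so the stream can recover without duplicating records.
query = (orders.writeStream
         .format("delta")
         .outputMode("append")
         .option("checkpointLocation", "/mnt/checkpoints/orders")
         .start("/mnt/data-products/sales/orders_raw"))

query.awaitTermination()
```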
Not all data fits neatly into a classic Data Mesh flow. Enterprises often deal with a long tail of low-frequency, low-volume tables: kilobytes of niche data such as regulatory reports or impromptu queries. Integrating these via ETLs or virtual tables requires significant effort, involving tasks like schema mapping, pipeline setup, testing and maintenance.
The cost-benefit ratio skews heavily, because the cost of creating a single data flow is roughly the same whether it processes kilobytes or gigabytes of data daily.
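A back-of-the-envelope calculation shows how badly the ratio skews. The figures below are illustrative assumptions, not DXC benchmarks.

```python
# Assumed: building and maintaining any single pipeline costs roughly
# the same in engineering effort, regardless of data volume.
pipeline_cost_per_year = 20_000          # USD, assumed build + run + maintain

flows = {
    # name: assumed annual business value of having the data integrated, USD
    "sales_transactions (TB/day)": 500_000,
    "regulatory_reports (KB/month)": 4_000,
}

for name, value in flows.items():
    print(f"{name}: value/cost = {value / pipeline_cost_per_year:.1f}x")

# sales_transactions (TB/day): value/cost = 25.0x
# regulatory_reports (KB/month): value/cost = 0.2x
# The long-tail table costs more to integrate than it returns,
# which is exactly the gap AI Mesh agents are meant to fill.
```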
Research from 2025 highlights this: many tables remain underutilized due to integration overhead, leading to data debt and missed opportunities. The problem compounds for AI-driven enterprises; models require diverse data, but siloed, low-volume sources hinder both training and inference.
AI Mesh extends Data Mesh principles to AI systems, creating a distributed network of AI agents that dynamically access underlying sources. Unlike static pipelines, AI Mesh leverages agentic AI (autonomous agents that reason, plan and execute tasks) and focuses on unifying control, discovery and monitoring across upstream and downstream AI systems (often via vendor tools such as Snowflake Copilot).
AI Mesh addresses enterprise needs for distributed intelligence. Agent-mesh tooling optimizes agent-to-agent communication, while agentic data management (from vendors such as Informatica and Snowflake) automates governance. Agents can predict issues, automate quality checks and integrate predictive analytics in real time.
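To give a flavor of an automated quality check, here is a minimal agent sketch that flags null-rate anomalies in a table. It is a toy illustration under an assumed threshold, not any vendor's implementation.

```python
# Assumed tolerance: flag a column when more than 5% of values are missing.
NULL_RATE_THRESHOLD = 0.05

def null_rates(rows: list[dict]) -> dict[str, float]:
    """Compute the fraction of missing values per column."""
    if not rows:
        return {}
    return {column: sum(1 for r in rows if r.get(column) is None) / len(rows)
            for column in rows[0]}

def quality_check_agent(table_name: str, rows: list[dict]) -> list[str]:
    """Return human-readable findings instead of silently passing bad data."""
    findings = []
    for column, rate in null_rates(rows).items():
        if rate > NULL_RATE_THRESHOLD:
            findings.append(
                f"{table_name}.{column}: {rate:.0%} nulls exceeds "
                f"{NULL_RATE_THRESHOLD:.0%} threshold")
    return findings

sample = [{"id": 1, "region": "EU"},
          {"id": 2, "region": None},
          {"id": 3, "region": None}]
print(quality_check_agent("hr_headcount", sample))
# ['hr_headcount.region: 67% nulls exceeds 5% threshold']
```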
The magic happens when Data Mesh and AI Mesh overlap. High-volume flows use Data Mesh's structured pipelines (potentially AI-generated for speed of delivery), while AI Mesh handles the rest via agent swarms. This creates a distributed data-serving platform for structured products, enabling multiple data fabrics to act in unison.
Imagine:
Core flows: Data Mesh integrates sales data across domains via ETLs.
Expedient queries: AI agents query low-volume HR tables, routing requests agent-to-agent (A2A) for insights without new pipelines (see the sketch after this list).
Governance overlay: Federated rules ensure security, with agents dynamically enforcing policies.
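A minimal sketch of that A2A routing might look like the following. The registry, agent names and table names are hypothetical stand-ins for real domain agents.

```python
# Hypothetical A2A routing: a coordinator forwards a question to
# whichever domain agent owns the relevant low-volume table.
AGENT_REGISTRY = {
    "hr": lambda q: f"[hr-agent] answered '{q}' from hr.headcount (no pipeline built)",
    "finance": lambda q: f"[finance-agent] answered '{q}' from finance.ledger",
}

def coordinator(question: str, domain: str) -> str:
    agent = AGENT_REGISTRY.get(domain)
    if agent is None:
        raise ValueError(f"No agent registered for domain '{domain}'")
    return agent(question)   # one A2A hop instead of a new ETL pipeline

print(coordinator("How many engineers joined in Q3?", "hr"))
```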
This hybrid approach reduces complexity. Agents act as general-purpose data-access tools in low-volume scenarios, whereas in high-volume situations they help build the traditional Data Mesh infrastructure itself.
Why doesn't all data flow via agents? Agentic processing is expensive because it is priced per token, so it should be restricted to low-volume data flows to keep costs under control.
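A rough calculation makes the point. The token size and price below are illustrative assumptions; actual pricing varies by vendor and model.

```python
# Assumed: ~4 bytes of text per token and a model priced at
# $3 per million input tokens.
BYTES_PER_TOKEN = 4
USD_PER_MILLION_TOKENS = 3.0

def agent_cost_usd(bytes_processed: float) -> float:
    tokens = bytes_processed / BYTES_PER_TOKEN
    return tokens / 1_000_000 * USD_PER_MILLION_TOKENS

print(f"10 KB report:  ${agent_cost_usd(10e3):.4f}")   # ~$0.0075, negligible
print(f"1 GB/day feed: ${agent_cost_usd(1e9):,.0f}")   # ~$750 per day, pipeline territory
```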
This synergy offers:
Efficiency: Minimal effort for low-value, long-tail integrations
Scalability: Agents handle growth, and meshes distribute load
Security and compliance: Unified governance with agentic oversight
Innovation: AI-driven insights across all data
Data Mesh and AI Mesh aren't rivals; they're partners, bringing order and harmony to enterprise data chaos. By layering agentic intelligence over decentralized data products, organizations achieve ultimate connectivity: strong at the core, agile at the edges.
This unified approach turns distributed systems into a cohesive whole, powering governance, discovery and innovation. For enterprises pursuing AI maturity, adopting this hybrid is key.
The future? A seamless mesh where data and AI flow freely, driving unprecedented value.
Josef Habdank is the AI & Data Engineering Global Practice Lead at DXC Technology, specializing in scalable data platforms and GenAI-driven code generation. His team manages a global data and AI portfolio, driving technical excellence and expanding market reach. With a firm commitment to ethical AI and advancing artificial general intelligence (AGI) Level 5, Josef empowers organizations to unlock the potential of data-driven innovation.