For a decade, the marketing technology stack was sold as a ladder: add best-of-breed tools, connect them with integrations, and climb towards personalisation at scale. In practice, many large organisations ended up with a toolchain that looks less like a ladder and more like a plate of spaghetti: overlapping licences, brittle connectors, and slow change control.
The evidence of underuse is now hard to ignore. Gartner reports that martech utilisation has fallen to 49%: barely half of the tools organisations pay for are actively used. It also estimates martech accounts for nearly 22% of total marketing spend, a large enough slice that finance scrutiny is inevitable. This is the economic backdrop to “stack rationalisation”: the argument is not ideological. It is a budgeting conversation.
Yet the number of tools is not collapsing neatly. MarTech’s 2025 State of Your Stack Survey found 62.1% of respondents used more tools than two years earlier, 23.1% used about the same number, and 14.8% used fewer. That 14.8% figure is real tool reduction, but it is still the minority. The first-order change is not fewer boxes on a slide. It is a shift in where “the system” lives, and what does the work.
The operational drivers are consistent across industries. In the same survey, 65.7% cited data integration as their biggest stack management challenge. When asked about future risk, the top concerns were data silos, cost of ownership, adaptation to change, skills gaps, and privacy and security. Vendor lock-in was a mid-tier concern, but it rose for enterprises, and it tends to surface late, once migrations become costly.
The stack is shrinking, then, in a specific sense: fewer teams want to own a maze of point-to-point integrations, and fewer leaders want marketing outcomes to depend on whether a connector breaks on a Friday night. That pressure is pushing marketing technology towards fewer centres of gravity: unified data layers, shared identity and consent services, and AI tooling that sits across workflows rather than inside a single channel tool.
From toolchain to AI-first operating system
The emerging replacement for the traditional stack is not a single product. It is an operating model: an AI-first “marketing operating system” that treats data, decisioning, content, and orchestration as a continuous loop. The meaningful change is that intelligence and workflow move closer together. Instead of a marketer exporting an audience, uploading to a channel tool, waiting for results, and iterating manually, the system can decide and act in near real time, with humans supervising and setting guardrails.
This is why the most revealing vendor announcements of the past two years have not been about a new email feature. They have been about agent runtimes, grounding layers, and workflow automation.
At Salesforce, the pivot has been explicit. In its Einstein Copilot announcement, Marc Benioff framed the moment in sweeping terms: “AI is the single most important moment in the history of our industry,” emphasising deep integration of data and metadata as the differentiator. Beneath the rhetoric is a technical thesis: AI assistants become useful when they are grounded in unified enterprise data, can invoke governed actions, and can automate across systems rather than generate text in isolation.
Adobe has made a similar bet, but with a marketing practitioner lens. Its announcement of Adobe Experience Platform AI Assistant stressed task automation, audience and journey generation, and simulation of outcomes through a conversational interface built into Experience Cloud applications. Anjul Bhambhri argued that AI assistants are driving a paradigm shift within enterprise software, pointing to platform scale as the reason Adobe could ship an out-of-the-box assistant that draws on the profiles and interactions already flowing through its platform.
Microsoft’s language is less marketing-specific, but the architecture implied is similar: AI moves from answering questions to executing multi-step tasks, and organisations restructure around a truly integrated system. Satya Nadella wrote in March 2026 that AI experiences are evolving from answering questions and suggesting code, to executing multi-step tasks with clear user control points, and positioned Copilot as a unifying layer across agents, apps, and workflows.
Taken together, these announcements sketch the same destination: an AI layer that is natively embedded, grounded in governed enterprise data, and able to trigger and execute actions across workflows. That is why the stack is “collapsing” into an operating system: not because tools vanish, but because the intelligence and control plane moves up a layer.
CDP versus cloud versus AI-native and the new centres of gravity
The fight for the centre of the stack used to be framed as suites versus best-of-breed. Today it is increasingly framed as data gravity. Wherever the cleanest, most complete customer data lives becomes the easiest place to run models, evaluate outcomes, and activate experiences.
This shift shows up in how practitioners describe their own architectures. In a 2025 analysis of the martech landscape, chiefmartec reported 15,384 solutions in the market, up year-on-year, while also documenting meaningful churn through acquisition or shutdown. This is consolidation in the only form that ultimately matters: categories thinning and weak vendors disappearing, even while new AI-native entrants keep spawning.
The CDP category is a good illustration of the new squeeze. CDPs are increasingly being absorbed into warehouse-led composable architectures or embedded directly into engagement systems. At the same time, cloud data warehouses are becoming the named centre platform, and engagement platforms are growing at the activation layer.
This is reinforced by how teams are building. More than half of organisations now integrate their martech stack with a cloud data warehouse or data lake, treating it as a universal data layer and source of truth. This is the operational meaning of data gravity: if governance, lineage, and access controls already exist in the warehouse, AI and workflows will follow it.
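The warehouse-as-source-of-truth pattern can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: it uses sqlite3 as a stand-in for a cloud warehouse such as Snowflake or BigQuery, and the table and column names are hypothetical.

```python
import sqlite3

# Stand-in for a cloud data warehouse; in a real stack this would be a
# governed connection to Snowflake, BigQuery, Redshift, or Databricks.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer_profile (
        customer_id        TEXT PRIMARY KEY,
        consent_marketing  INTEGER,  -- consent lives with the data
        lifetime_value     REAL,
        last_purchase_days INTEGER
    )
""")
conn.executemany(
    "INSERT INTO customer_profile VALUES (?, ?, ?, ?)",
    [("c1", 1, 1200.0, 12), ("c2", 0, 800.0, 5), ("c3", 1, 300.0, 90)],
)

def build_audience(conn, min_ltv=500.0, max_recency_days=30):
    """Activation reads the same governed table every other tool uses,
    so segmentation, consent, and measurement share one definition."""
    rows = conn.execute(
        """SELECT customer_id FROM customer_profile
           WHERE consent_marketing = 1
             AND lifetime_value >= ?
             AND last_purchase_days <= ?""",
        (min_ltv, max_recency_days),
    ).fetchall()
    return [r[0] for r in rows]

print(build_audience(conn))  # ['c1'] — consented, high-LTV, recent
```

Because the consent flag and the behavioural columns sit in one governed table, any AI or workflow tool pointed at the warehouse inherits the same definitions, which is precisely why data gravity pulls the rest of the stack towards it.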
The arrival of AI agents increases the pull. Generative AI is now mainstream inside stacks, but the next phase is not just content. Teams expect AI to drive personalisation, automated marketing tasks, data analysis, and decisioning. These are system-level capabilities.
This helps explain why AI-native platforms are being defined less by whether they have a chatbot, and more by whether they can operate as a control plane: orchestrate journeys, invoke actions, track outcomes, and run evaluation loops. It also explains the comeback of homegrown capabilities, as AI lowers the cost of building workflow apps while increasing the need for a stable backbone.
A practical way to interpret this market is that there are now three competing operating system approaches: the CRM-led data cloud, the experience platform-led journey layer, and the enterprise productivity ecosystem making customer data agent-accessible across workflows.
India in practice with BFSI and D2C workflows
In India, the case for AI-first operating systems is unusually concrete because of scale, fragmentation, and hybrid digital-physical journeys. Two contrasting patterns stand out: BFSI organisations treating orchestration as a risk-managed decision engine, and D2C organisations treating it as a retention and velocity machine.
In banking, the workflow is increasingly data to decision to action inside a governed runtime. IndusInd Bank offers a clear example. Campaigns that once took days due to data unification and segmentation now happen in minutes through integrated systems. The workflow involves combining credit score, behaviour, and demographics to trigger real-time offers. Charu Mathur frames this as executing contextual, highly relevant personalised campaigns in minutes.
Adobe’s work with HDFC Bank shows how orchestration spans both customers and employees, integrating insights across branches and digital channels.
D2C organisations operate with a different goal: speed and repeat purchase. Mamaearth, part of Honasa Consumer Ltd., focuses on retention through post-purchase journeys and hyper-personalisation. Abhishek Gupta highlights how intelligent insights enable targeted communication.
Lenskart demonstrates how D2C in India blends offline and online data, enabling faster execution and improved ROI.
The difference between BFSI and D2C is not capability but constraint. BFSI prioritises governance and compliance, while D2C prioritises speed and experimentation. Both converge on the same architecture: unified data, embedded intelligence, and continuous orchestration.
Migration strategies and the lock-in trap
For most enterprises, stack collapse is not a rip-and-replace. It is a phased migration, as legacy systems are deeply tied to revenue operations and compliance.
The starting point is an audit: mapping tools, dependencies, and utilisation. The biggest cost of a bloated stack is not licensing but operational drag: time spent managing integrations and inconsistent data.
Migration works best when it is use-case driven and architecture-aware. Integration challenges often lead to consolidation, but that can create new lock-in risks. The solution is to treat the AI-first system as an interoperability layer, not a closed ecosystem.
This means building exit strategies early: data portability, open schemas, event streaming, and vendor-agnostic capabilities.
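One concrete form of data portability is a vendor-neutral event envelope. The sketch below is loosely modelled on the CloudEvents specification; the event types, sources, and payload fields are illustrative, not any particular platform's schema.

```python
import json
import uuid
from datetime import datetime, timezone

def to_portable_event(event_type: str, source: str, payload: dict) -> str:
    """Wrap a marketing event in a vendor-neutral envelope (modelled on
    CloudEvents), so downstream tools can be swapped without re-mapping
    every integration."""
    envelope = {
        "specversion": "1.0",
        "id": str(uuid.uuid4()),
        "source": source,                  # e.g. "/web/checkout"
        "type": event_type,                # e.g. "order.completed"
        "time": datetime.now(timezone.utc).isoformat(),
        "datacontenttype": "application/json",
        "data": payload,
    }
    return json.dumps(envelope)

event = to_portable_event("order.completed", "/web/checkout",
                          {"order_id": "A-1001", "value": 2499.0})
print(json.loads(event)["type"])  # order.completed
```

Events in an open envelope like this can flow through any stream (Kafka, Pub/Sub, Kinesis) and land in any warehouse, which is what keeps the exit door open even after consolidating onto one platform.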
AI introduces a final risk. Agents make systems complex rather than just complicated. Automation can amplify errors, making governance, guardrails, and observability essential.
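What a guardrail looks like in practice can be sketched in a few lines. This is a hypothetical illustration of the principle, with an invented spend cap: every agent action is bounded and logged, and anything outside the bound is blocked rather than executed.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("agent-guardrail")

# Illustrative limit: cap how much an autonomous agent may commit per
# action. The number itself is arbitrary; the pattern is what matters.
MAX_SPEND_PER_ACTION = 500.0

def guarded_execute(action: str, spend: float, execute) -> bool:
    """Run an agent action only if it stays inside the guardrail, and
    log every decision so the audit trail exists by construction."""
    if spend > MAX_SPEND_PER_ACTION:
        log.warning("blocked %s: spend %.2f exceeds cap", action, spend)
        return False  # escalate to a human instead of acting
    log.info("executing %s with spend %.2f", action, spend)
    execute()
    return True

ran = guarded_execute("boost_campaign_x", 250.0, lambda: None)
print(ran)  # True
```

The logging is not decoration: because agents act continuously, the observability trail is the only way to reconstruct why a system made a decision after the fact.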
Skills, operating model, and the adoption clock
The biggest impact of stack collapse is organisational. It changes the role of the CMO and how marketing works with IT and data teams.
A majority of CMOs believe AI will fundamentally change their role within two years. The shift is from campaign execution to systems design: deciding what gets automated, what stays human, and how governance works.
Adoption is still uneven. Many organisations are experimenting with AI, but deep integration into workflows is limited. The next 12 to 18 months will see a gap between AI excitement and operational reality.
In the near term, three shifts will dominate: embedded assistants in core platforms, a move from batch to real-time marketing, and early deployment of agents in governed environments.
In the medium term, the operating system model becomes literal. AI will unify agents, apps, and workflows into a single system layer.
Regulation will accelerate this shift. In India, new data protection rules are pushing consent, retention, and compliance into core operations. Globally, frameworks like the EU AI Act will require explainability and auditability.
The market outlook is paradoxical but clear. Vendor numbers may continue to grow, but the traditional stack model is disappearing. Decisioning, orchestration, and measurement are consolidating into fewer layers.
The winners will be those who reduce operational drag, maintain strong data governance, and turn AI into a reliable engine for executing customer workflows.
Disclaimer: All data points and statistics are attributed to published research studies and verified market research. All quotes are either sourced directly or attributed to public statements.