AI-Driven Marketing Governance: Who Decides What Now?
- Mladen Tošić

- Nov 2, 2025
- 7 min read
Updated: Nov 3, 2025
Governance and Decision Rights in AI-Driven Marketing
Everyone’s talking about AI use cases in marketing. Almost no one is talking about who decides what once AI enters the room: AI-driven marketing governance.
That’s the real test of readiness.
AI’s promise is to make marketing faster, more data-driven and more adaptive. Yet most marketing operating models were built for a slower age — annual plans, agency-led creative, weekly performance meetings. They weren’t designed for algorithms that can generate 30 creative variants before lunch or reallocate media spend in real time.
So the question becomes: how do you make decisions at the speed of AI without losing control of your brand, budget, or judgement?
If you’ve read my earlier post on the marketing operating model for an AI-enabled world, you’ll know I define an operating model as more than an org chart. It’s the choreography of people, partners, data, technology, processes — and, above all, behaviour. Governance sits inside that choreography, not next to it.
Two schools, one challenge
There are, broadly speaking, two schools of operating-model design. The rule-based one codifies structure, accountabilities, decision rights and forums — the dominant mode in large organisations and most typical consulting playbooks. The principle-based one focuses on behaviours, culture, trust and shared language — the secret sauce of sustained value creators and far more common in fast-moving start-ups. In AI-driven marketing, you’ll need a blend: enough hard structure to stay accountable, enough soft tissue to stay adaptive.
Go only hard and you slow AI down.
Go only soft and you lose grip on risk, brand and spend.
Start with the decisions
The most effective operating-model approach I learned at Bain always began with the critical decisions — the value moments that determine performance.
In practice, my clients always appreciated how this approach cut through the politics and noise to get to the heart of how their business creates value. Those decisions rarely change; what changes is how you make them, who is involved, and what information they use.
So start there, then design in parallel:
the hard enablers (roles, structure, data, technology, decision rights, accountabilities, forums, guardrails); and
the soft enablers (behaviours, cadence, shared language, the confidence to challenge).
Hard and soft aren’t sequential — they’re twin systems. In AI-driven marketing they must evolve together, or the rules get ignored and the culture gets fluffy.
When the hard layer falls behind, AI either stalls or goes rogue. That might mean roles and structures that still mirror old campaign processes, fragmented data across teams, or tech that doesn’t yet connect creative, media and finance decisions. When the soft layer lags, people default to old habits — waiting for sign-offs, distrusting the model, or letting performance teams run unchecked.
Today, when I work with clients, we begin with a short diagnostic, sometimes just a few conversations, to pinpoint which marketing decisions matter most (or need most attention) — those with the biggest value upside and the highest feasibility given where the organisation is starting from. The approach is always tailored to context.
That diagnostic looks at both layers: whether decision rights are clear, but also whether roles, data, and tools truly support faster, smarter decisions.

Meet FunkyFuture Co
To make this real, let’s picture FunkyFuture Co, a mid-size wellness brand modernising its marketing with AI. While it’s not a real company, the challenges and approaches are — they draw on real client situations I’ve encountered over the years.
FunkyFuture Co’s budget is under pressure: they want to move faster, waste less media and make content more relevant — without losing their brand soul.
Three decisions matter most right now:
Marketing planning and budget allocation
Content and creative activation
In-campaign performance optimisation
1. Marketing planning and budget allocation
The decision:
“How do we set next quarter’s spend split between brand and performance using AI-generated forecasts?”
This example assumes the total marketing budget has already been agreed. The decision here is about allocation — how to distribute spend across channels and objectives to make that investment work harder.
Before:
Annual planning. Long meetings. Gut feel dressed as data.
Now:
AI runs scenario models monthly — exploring, for example, what happens if 10 per cent of spend moves from social to search, or if upper-funnel investment increases before a product launch.
Finance validates affordability; data science tests assumptions; the CMO still signs the decision.
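For the more technically minded, here is a deliberately simplified sketch of the kind of scenario comparison an AI planning tool might run. The channels, response curves and figures are invented for illustration; this is not a real media-mix model.

```python
import math

def predicted_contribution(spend: float, scale: float, saturation: float) -> float:
    """Toy response curve: contribution rises with spend but saturates."""
    return scale * (1 - math.exp(-spend / saturation))

# channel: (current spend in EUR, scale, saturation); all figures are invented
channels = {
    "social": (4_000_000, 6_000_000, 3_000_000),
    "search": (3_000_000, 5_000_000, 2_500_000),
}

def total_contribution(spend_by_channel: dict) -> float:
    return sum(
        predicted_contribution(spend, channels[ch][1], channels[ch][2])
        for ch, spend in spend_by_channel.items()
    )

baseline = {ch: params[0] for ch, params in channels.items()}
shift = 0.10 * baseline["social"]          # move 10% of social spend to search
scenario = {
    **baseline,
    "social": baseline["social"] - shift,
    "search": baseline["search"] + shift,
}

print(f"Baseline contribution: €{total_contribution(baseline):,.0f}")
print(f"Scenario contribution: €{total_contribution(scenario):,.0f}")
```

The point isn’t the maths. It’s that outputs like these arrive in the monthly forum as inputs to a decision, not as verdicts.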
What changes:
Hard layer: the forum moves from once a year to monthly; model outputs circulate before meetings; ownership is explicit — marketing decides, finance validates, data supports. Roles and structure shift toward smaller, cross-functional planning pods combining marketing, finance and data expertise. Teams need cleaner, faster access to data and scenario-modelling tools; technology must enable dynamic budget re-allocation and shared visibility across departments.
Soft layer: behaviour shifts from “model-as-proof” to “model-as-input.” Overrides are logged with rationale, and everyone speaks the same language about uncertainty.
Illustrative change at FunkyFuture Co:
Metric | Before | After |
Decisions supported by scenario models | 0 % | 70 % |
Time to approve in-quarter reallocation | 3–4 weeks | 3–5 days |
Variance to planned media mix | ±15 % | ±5 % |
Why it matters:
Better alignment between planned and actual media spend may only shift efficiency by a few percentage points — but at scale, those points matter.
Across large advertisers, studies like Nielsen’s Predictive ROI analyses show that small gaps in spend allocation can translate into meaningful lost value. Even a 1–3 per cent improvement in media efficiency — modest by most benchmarks — can free up €200–600 k on a €20 million budget, along with several FTE-months of capacity to reinvest in insight or innovation.
Equally important, leaner planning teams and connected data infrastructure make those gains sustainable. The next quarter’s plan can start from live performance data rather than static reports — a structural change as much as a behavioural one.
2. Content and creative activation
The decision:
“Which creative variant goes live for each audience or market this week?”
Before:
The agency built one master asset.
Markets asked for tweaks.
Brand approved.
Performance tested later.
Now:
Platforms such as Celtra, Smartly.io (or others) generate dynamic variants from brand-approved templates.
AI proposes copy or image tweaks; the number of live assets multiplies overnight.
What changes:
Hard layer: brand rules encoded in the platform; clearer roles across brand, creative, and performance teams; smaller but more integrated production pods using shared templates and automation tools; better use of data to inform creative briefs; and a tech stack that connects asset creation to real-time performance data. Sign-off tiers are defined — “green” variants auto-publish, “amber/red” require human review.
Soft layer: curiosity replaces control. Teams share a common critique language that lets brand, creative and data talk about performance and distinctiveness in the same sentence.
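As a hypothetical illustration of how those sign-off tiers might be expressed in practice (the checks and thresholds are invented, and would normally sit inside the creative platform rather than in standalone code):

```python
from dataclasses import dataclass

# Hypothetical sketch: field names and thresholds are invented; real checks
# would run inside the creative platform against encoded brand rules.
@dataclass
class Variant:
    uses_approved_template: bool
    brand_terms_ok: bool      # e.g. no banned claims or competitor names
    visual_fit_score: float   # 0-1 match against brand visual guidelines

def sign_off_tier(v: Variant) -> str:
    """Return 'green' (auto-publish), 'amber' (human review) or 'red' (block)."""
    if not v.uses_approved_template or not v.brand_terms_ok:
        return "red"
    if v.visual_fit_score < 0.8:
        return "amber"
    return "green"

variant = Variant(uses_approved_template=True, brand_terms_ok=True,
                  visual_fit_score=0.92)
print(sign_off_tier(variant))  # -> green, so it publishes without manual review
```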
Illustrative change:
Metric | Before | After |
Assets on-brand at first review | 70 % | 90 % |
Time to localise five markets | 10 days | 2 days |
Assets needing human sign-off | 100 % | ≈ 35 % |
Why it matters:
Automation can reduce production costs by roughly 20–40 per cent (industry estimates). More personalised, contextually relevant creative tends to outperform generic executions — IPA’s work on effectiveness is clear on that direction of travel — but the size of the lift depends on channel, sector and how the personalisation is done. Industry evidence and recent meta-analyses suggest that gains in the 5–15 per cent range on engagement-type metrics are realistic when it’s done well.
And, as the Ehrenberg-Bass Institute reminds us, growth comes from building both mental and physical availability. When distinctiveness and distribution improve together, the impact compounds.
For FunkyFuture Co, faster, leaner teams, stronger data links, and smarter automation mean creative work that’s both efficient and distinctive — and better aligned with growth.
In practice, success depends on collaboration — brand, agency and martech partners each have a role in governing how AI and creativity meet.
3. In-campaign performance optimisation
The decision:
“When do we override the algorithm during a live campaign?”
Before:
Weekly review calls. Manual tweaks. Slow reactions.
Now:
Budgets and bids update continuously.
But the model can over-optimise for cheap clicks and miss long-term brand value.
Humans still need to know when to step in.
What changes:
Hard layer: define thresholds for human review (e.g. sudden cost spikes or brand-safety breaches); set clear escalation paths and log every override. Team structure shifts towards smaller cross-functional pods combining media, analytics and brand decision-makers. Roles evolve from channel specialists to performance stewards with a broader view of brand impact. Data pipelines and dashboards need to surface exceptions automatically, and technology must enable override tracking and learning loops.
Soft layer: build confidence to challenge the algorithm and capture lessons so both people and models improve.
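To make those guardrails tangible, here is a hypothetical sketch of a review threshold and an override log. The metric names and thresholds are illustrative, not drawn from any particular ad platform.

```python
from datetime import datetime, timezone

CPA_SPIKE_THRESHOLD = 1.5   # flag if cost per action jumps more than 50% day-on-day
override_log = []           # in practice this would live in a shared, auditable system

def needs_human_review(cpa_today: float, cpa_yesterday: float,
                       brand_safety_breach: bool) -> bool:
    """Escalate when costs spike suddenly or a brand-safety rule is breached."""
    spiked = cpa_yesterday > 0 and cpa_today / cpa_yesterday > CPA_SPIKE_THRESHOLD
    return spiked or brand_safety_breach

def log_override(campaign: str, action: str, rationale: str) -> None:
    """Every manual intervention is recorded so people and models can learn from it."""
    override_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "campaign": campaign,
        "action": action,
        "rationale": rationale,
    })

if needs_human_review(cpa_today=18.0, cpa_yesterday=10.0, brand_safety_breach=False):
    log_override("spring-launch", "paused search bids",
                 "CPA spiked 80% overnight; suspect tracking issue")
```

The value lies less in the code than in the discipline: every override carries a rationale, so both the model and the team learn from it.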
Illustrative change:
Metric | Before | After |
Response time to anomalies | 48 hrs | 4 hrs |
Overrides with documented rationale | 10 % | 90 % |
Campaign volatility week-to-week | High | Controlled |
Why it matters:
Industry benchmarking shows that moving from weekly to faster optimisation cycles can lift incremental performance by around 10–20 per cent. For FunkyFuture Co’s €10 million activation programme, that translates to roughly €1–2 million in additional short-term sales effect — often worth several times more over the long term — without any extra media spend.
What it all shows
The decisions haven’t changed — the decision-making has.
Hard and soft enablers must evolve together — roles, structure, data and technology on one side; behaviours and culture on the other. Guardrails keep you safe; culture keeps you fast.
AI doesn’t replace judgement — it exposes it. The technology is neutral; the quality of your decisions is not.
This tension isn’t new — AI just compresses time. Weak governance breaks sooner.
And it all comes back to value. Better decisions cut waste, unlock capacity, and drive growth — exactly what I aim to quantify through the Marketing Value Engine™, a framework I use to link marketing choices to business outcomes.
Where to start
“If you want AI to make your marketing better, start by listing the three to five decisions your team must get right every month (or must improve on).”
For each, ask two questions:
What do we want the machine to do here — and what data or technology do we need to support it?
What roles and behaviours do we need from people so it stays on-brand and accountable?
Do that, and AI becomes a force multiplier — not a source of governance debt.


