In short: the data mirage is the illusion that "tracking everything" equals "attributing correctly". Research from Binet & Field, Nielsen, and Harvard Business Review shows that last-click attribution overestimates short-term performance and underestimates brand building — the true driver of long-term growth.
- 60/40 brand vs activation rule (Binet & Field, IPA Databank).
- Marketing Mix Modeling attributes up to 50-70% of sales impact to brand building, versus the <20% estimated by digital attribution (Nielsen ROI Report).
- Paid digital over-attribution can reach 100-300% vs incrementality tests (Harvard Business Review, WARC).
What the data mirage is in marketing
The data mirage is the phenomenon whereby a dashboard seemingly rich in data — clicks, last-click conversions, platform ROAS — returns a systematically distorted snapshot of real marketing effectiveness. Everything looks measured, everything looks under control, but the budget decisions that follow are based on inflated signals. The result is an average waste estimated between 25% and 40% of spend (eMarketer, Commerce Signals), concentrated precisely on the channels the platform attributes to itself.
The problem is not technological: it is epistemological. Measuring a lot does not mean measuring well. And the most common attribution models — last-click, last-touch, even platform data-driven — are built to optimize a single platform, not to answer the question that truly matters: what would happen to revenue if I turned this channel off tomorrow?
Why last-click attribution misleads
Last-click assigns 100% of the credit to the last touchpoint before conversion. If the user saw a TV spot in January, searched for the brand on Google in March and clicked a Brand Search ad before purchase, last-click celebrates Brand Search. But Brand Search did not create demand: it only intercepted it. Turning it off may lower the declared Google Ads ROAS while leaving sales almost unchanged, as shown by eBay's famous incrementality experiments (Blake, Nosko, Tadelis, 2015).
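The mechanics are easy to see in a toy simulation. A minimal sketch with hypothetical journeys (the channel names and counts are illustrative, not real data): every journey ends with a Brand Search ad click, so last-click hands that ad all the credit, even when demand was created upstream.

```python
# Toy illustration (hypothetical journeys): why last-click flatters Brand Search.
# Each journey is the ordered list of touchpoints before one conversion.

journeys = [
    ["tv", "brand_search_ad"],      # demand created by TV, captured by the ad
    ["tv", "brand_search_ad"],
    ["social", "brand_search_ad"],
    ["brand_search_ad"],            # the only journey the ad plausibly caused
]

def last_click_credit(journeys):
    """Assign 100% of each conversion to the final touchpoint."""
    credit = {}
    for j in journeys:
        credit[j[-1]] = credit.get(j[-1], 0) + 1
    return credit

print(last_click_credit(journeys))  # {'brand_search_ad': 4}
# Last-click says Brand Search drove 4/4 conversions. A holdout test that
# paused the ad would likely show most of these sales happening anyway.
```

Nothing about the model is wrong mathematically; it simply answers the wrong question ("who touched the user last?") instead of the causal one ("which sales would disappear without this channel?").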
Controlled geo-lift and ghost ads experiments repeatedly show that the real lift of paid digital is much lower than what platform attribution models claim. Meta, Google and TikTok tend to self-attribute conversions that would have happened anyway — a phenomenon measured by the Marketing Science Institute and covered by Harvard Business Review.
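The geo-lift readout itself is simple arithmetic. A sketch with made-up regional sales figures (all numbers are hypothetical, chosen only to show the shape of the calculation): incremental lift is the sales difference between test and control geos, expressed against the control baseline, and it can then be compared with what the platform self-attributes.

```python
# Sketch of a geo-lift readout (hypothetical data): sales in matched regions
# with the campaign active (test) vs paused (control).

test_sales = [1040, 980, 1105, 1010]      # regions with ads on
control_sales = [1000, 950, 1080, 1000]   # matched regions with ads off

def incremental_lift(test, control):
    """Incremental sales as a fraction of the control baseline."""
    t, c = sum(test), sum(control)
    return (t - c) / c

lift = incremental_lift(test_sales, control_sales)
platform_attributed = 0.08  # hypothetical: platform claims 8% of sales

print(f"measured lift: {lift:.1%}")                       # 2.6%
print(f"over-attribution: {platform_attributed / lift:.1f}x")  # 3.1x
```

With these illustrative numbers the platform claims roughly three times the sales the experiment can actually detect — the same order of magnitude as the 100-300% over-attribution reported above.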
If you want to dig into the technical side, at Deep Marketing we published a dedicated analysis on why 78% of marketers cannot measure attribution.
Attribution: myth vs reality (table)
What marketing science says: Binet & Field, Nielsen, Ehrenberg-Bass
The research by Les Binet and Peter Field on the IPA Databank — the world's largest archive of award-winning effectiveness case studies — produced the famous 60/40 rule: on average, the most effective campaigns allocate about 60% of budget to brand building (long-term goals, broad reach, emotional content) and 40% to activation (short-term performance, promotions, lead gen). The optimal ratio varies by category (B2B tends to 46/54, FMCG to 60/40, pure e-commerce to 50/50), but the order of magnitude is robust across 25+ years of data (Binet & Field, The Long and the Short of It, IPA 2013 and later updates).
Nielsen's Marketing Mix Models, analyzing over 500 brands in recent years, consistently find that creativity and media quality matter more than targeting: creative quality explains on average 47% of incremental sales, versus 36% for reach and targeting (Nielsen ROI Report). The same framework shows that performance digital, alone, underestimates its indirect contribution and overestimates its direct contribution — exactly the data mirage.
The Ehrenberg-Bass Institute (Byron Sharp, How Brands Grow) adds the fundamental piece: brand growth depends mostly on increased penetration (how many people buy the brand at least once), not on frequency among existing customers. This implies that typical "loyalty" and "repeat purchase" metrics — dominant in many dashboards — are second-order indicators, not causal drivers.
For those who want to go deeper on metrics, we recommend our guide on ROAS, MER, LTV and CAC: which to use and when.
How to allocate brand vs performance budget in 2026
2026 does not make the 60/40 rule obsolete: it makes it harder to apply, because third-party cookie deprecation, AI in platforms and channel fragmentation make tracking even noisier. The pragmatic playbook we use at Deep Marketing rests on three pillars.
1. Always triangulate three sources. Platform attribution + MMM (even a light one, with open tools like Meta Robyn or Google Meridian) + at least one incrementality test per year for each critical channel. No single source is reliable alone. When the three diverge, the truth is usually close to incrementality.
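To make the "light MMM" leg of the triangle concrete, here is a minimal sketch on synthetic data (this is not Robyn or Meridian — real MMMs add saturation curves, seasonality, and priors; the coefficients, decay rate, and spend figures below are invented for illustration): regress weekly sales on adstocked channel spend and read each channel's aggregate contribution from the fitted coefficients.

```python
# Minimal "light MMM" sketch on synthetic data: OLS regression of weekly
# sales on adstocked channel spend. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
weeks = 104
tv = rng.uniform(0, 100, weeks)      # synthetic weekly spend per channel
search = rng.uniform(0, 60, weeks)

def adstock(spend, decay=0.5):
    """Carry over part of past spend: x_t = spend_t + decay * x_{t-1}."""
    out = np.zeros_like(spend)
    for t, s in enumerate(spend):
        out[t] = s + (decay * out[t - 1] if t else 0.0)
    return out

# Simulate sales with a known ground truth (base 500, TV effect 3.0,
# search effect 1.5) plus noise, then try to recover it.
sales = 500 + 3.0 * adstock(tv) + 1.5 * adstock(search) + rng.normal(0, 20, weeks)

X = np.column_stack([np.ones(weeks), adstock(tv), adstock(search)])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(dict(zip(["base", "tv", "search"], coef.round(2))))
# The estimates should land near the true 500 / 3.0 / 1.5 effects.
```

The point of the exercise: because the regression works on aggregate sales rather than tracked clicks, it credits TV and other untracked channels that user-level attribution structurally cannot see.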
2. Protect brand budget even in recession. IPA analysis shows that brands that cut brand budget in downturns recover much more slowly when the market restarts. A strategic guide on the topic is our article Media Mix Modeling: the measurable marketing revolution.
3. Judge channels by their role, not by their ROAS. Brand Search will always have very high ROAS: it serves already warm demand. Awareness paid social will always have low last-click ROAS: it builds demand. If you judge them with the same metric, you systematically cut the wrong lever. This is the heart of the data mirage.
One last practical caveat for 2026: beware of platform auto-optimization when the feedback loop is based only on last-click conversions. Smart Bidding, Advantage+ and similar tools maximize what the platform can measure — which is, by construction, a subset of reality. Feeding these algorithms with value signals closer to margin (LTV, offline sales, qualified leads verified by sales) tangibly reduces drift toward the data mirage and improves the quality of acquired traffic, even at the same spend.
Need to optimize your 2026 marketing budget?
Deep Marketing helps SMEs and brands balance brand building and performance with methodologies inspired by MMM and incrementality testing, choosing channels by their role in the funnel and not by platform ROAS. Request a budget audit or discover our digital advertising consulting.
Frequently Asked Questions
What is the data mirage in marketing?
The data mirage is the illusion that a data-rich dashboard is also accurate. Last-click and platform multi-touch attribution models overestimate low-funnel channels and underestimate brand building, TV and OOH. The result: budget decisions based on inflated signals, with average waste estimated between 25% and 40% of spend (eMarketer, DemandScience, Commerce Signals).
What's the difference between MMM and attribution?
Attribution works at the single-user level, reconstructing the tracked touchpoints before a conversion: it sees cookie/ID-tracked digital well and little or nothing of the rest. Marketing Mix Modeling works at the aggregate level, estimating via regression the impact of each channel (online and offline) on total sales, including TV, OOH and word of mouth. They are complementary, not alternatives.
Why do Binet & Field say 60/40?
Analyzing over 1,400 effectiveness-award case studies in the IPA Databank, Binet & Field found that the campaigns most effective over the long term allocate about 60% of budget to brand building and 40% to activation. Brand building builds memory and preference; activation converts existing demand. Skewing to 100% performance produces short-term peaks followed by progressive erosion of effectiveness.
Does it make sense to track everything today?
No, and above all it is not realistic. Third-party cookies in deprecation, iOS ATT, consent mode and multi-device fragmentation make user-level tracking structurally incomplete. The sensible direction in 2026 is combining imperfect attribution with MMM and periodic incrementality tests, accepting probabilistic estimates instead of chasing an impossible deterministic precision.
How to avoid wasting marketing budget?
Three practical rules: (1) judge every channel based on its funnel role, not with the same ROAS metric; (2) protect a share of brand budget even under short-term pressure; (3) always triangulate platform attribution, MMM and at least one incrementality test per year on the most relevant channels. When the three sources diverge, trust the incremental test.
Sources and References
- IPA Databank — Binet & Field, The Long and the Short of It (60/40 rule)
- Nielsen — ROI Report & Marketing Mix Modeling Insights
- Harvard Business Review — Advertisers Should Act More Like Scientists (incrementality)
- WARC — The Attribution Problem
- Ehrenberg-Bass Institute — Byron Sharp, How Brands Grow
- Blake, Nosko, Tadelis (2015) — Consumer Heterogeneity and Paid Search Effectiveness: A Large-Scale Field Experiment (eBay)
- eMarketer — Marketers Waste About One-Fourth of Their Budgets