Meta's 2026 attribution changes: what marketers can actually still measure
On 12 January 2026, Meta removed the 7-day-click and 28-day-click attribution window options from Ads Manager. The default and only available click attribution is now 1-day click for direct-response campaigns, with a Meta-managed setting for everything else. Combined with the iOS attribution gap that has widened to 40-70% on most accounts, this is the largest single hit to Meta-reported performance visibility since the 2021 ATT rollout.
The result on most dashboards has been a reported CPA that appears 15-30% higher than the equivalent number under the old rules, ROAS that appears proportionally lower, and a reporting reality that no longer matches the operational reality of the same campaigns. This guide covers what changed in January, what remains trustworthy on Meta, how Google Ads and TikTok signals fill the read, and the practical reconciliation pattern for cross-platform measurement under the new rules.
What changed in January 2026
Three things changed at once.
Attribution window removal. The 7-day-click + 1-day-view default was the standard for direct-response Meta campaigns through the end of 2025. The 28-day-click and 28-day-view options were available for longer consideration cycles. Both were removed on 12 January 2026. Active campaigns using the longer windows were forcibly migrated to the shorter default during the same week.
Attribution window display in reports. Historical reports pulled after January 2026 are presented under the new attribution rules, not the rules that were in effect when the conversions originally fired. This is consequential for year-over-year analysis: a January 2026 number is structurally not comparable to a January 2025 number even though both come from the same Ads Manager.
iOS gap widening. Independent of the attribution window change, the iOS-side attribution gap has widened during 2025 and into 2026 as Apple's privacy framework has tightened. Industry estimates put the gap at 40-70% — meaning that for every 100 conversions a privacy-clean tracking system would attribute to Meta on iOS, Meta now reports somewhere between 30 and 60. The gap varies by vertical, audience composition, and conversion type.
What still works on Meta
Three Meta-side measurement surfaces remain trustworthy under the new rules.
Conversions API (server-side events)
The Conversions API is the server-to-server pipe that sends conversion events from your backend directly to Meta, independent of browser-side pixel tracking. CAPI events are not subject to ATT-driven iOS signal loss because they originate server-side after the conversion has fired in your own systems.
CAPI is the single most important measurement upgrade an account can make under the 2026 rules. Accounts with strong CAPI implementations report iOS attribution gaps closer to 20% than 40-70%. Accounts running CAPI alongside the pixel see Meta's deduplication merge the signals correctly, and the merged data is what feeds the delivery optimisation algorithm — so CAPI improves not just measurement but also delivery quality.
CAPI is not a complete fix. It still operates inside Meta's attribution window, so the 7-day-click removal still bites. But the iOS gap inside CAPI-instrumented accounts is substantially smaller.
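To make the deduplication mechanics concrete, here is a minimal sketch of assembling a single CAPI event in Python. The `build_capi_event` helper and all field values are illustrative, not a production implementation; the detail that matters is the shared `event_id`, which is what lets Meta match the server-side event against the browser pixel's copy of the same conversion.

```python
import hashlib
import json
import time

def build_capi_event(email: str, event_name: str, event_id: str, source_url: str) -> dict:
    """Build one Conversions API event. The shared event_id is what lets
    Meta deduplicate this server-side event against the browser pixel's
    copy of the same conversion."""
    # Meta expects user identifiers hashed: SHA-256 of lowercased, trimmed value
    hashed_email = hashlib.sha256(email.strip().lower().encode()).hexdigest()
    return {
        "event_name": event_name,             # e.g. "Purchase"
        "event_time": int(time.time()),
        "event_id": event_id,                 # must match the pixel's eventID
        "action_source": "website",
        "event_source_url": source_url,
        "user_data": {"em": [hashed_email]},
    }

# The payload is POSTed to the pixel's /events edge on Meta's Graph API
# (endpoint version and auth omitted here).
payload = {"data": [build_capi_event("Jane.Doe@example.com", "Purchase",
                                     "order-10472", "https://shop.example.com/checkout")]}
print(json.dumps(payload, indent=2))
```

Sending the same `event_id` from the pixel (browser side) and from this server-side call is the entire deduplication contract: Meta keeps one copy of the conversion, and the merged signal feeds delivery optimisation.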
Conversion Lift studies
Conversion Lift is Meta's own randomised-holdout incrementality measurement. A Lift study runs a holdout group that is excluded from seeing the campaign for the duration of the test, and compares conversion rates between the holdout and the exposed group. The output is a number for incremental conversions caused by Meta — not attributed by Meta, caused by Meta.
Lift studies are not affected by attribution-window changes because they do not depend on attribution at all. They are affected by the iOS gap only insofar as the gap creates noise; the holdout-vs-exposed comparison is largely robust to the gap as long as the gap applies similarly to both groups.
The constraint with Lift is cost and scale: a Lift study needs sufficient spend and sufficient duration (typically 4-8 weeks) to produce statistically meaningful output. Most accounts do not run continuous Lift; they run periodic studies as ground-truth reference points and use Ads Manager numbers in between.
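The holdout-vs-exposed arithmetic itself is simple. A sketch with illustrative numbers (not from any real study):

```python
def conversion_lift(exposed_n, exposed_conv, holdout_n, holdout_conv):
    """Estimate incremental conversions from a randomised holdout test.
    Randomisation means the holdout's conversion rate is, in expectation,
    what the exposed group would have done without the ads."""
    exposed_rate = exposed_conv / exposed_n
    holdout_rate = holdout_conv / holdout_n
    # Conversions the exposed group would have produced anyway
    baseline = holdout_rate * exposed_n
    incremental = exposed_conv - baseline
    lift_pct = (exposed_rate - holdout_rate) / holdout_rate * 100
    return incremental, lift_pct

# 1M exposed users with 12,000 conversions vs a 100k holdout with 1,000:
# holdout rate 1.0% -> baseline 10,000 -> 2,000 incremental, 20% lift
inc, lift = conversion_lift(1_000_000, 12_000, 100_000, 1_000)
print(f"incremental conversions: {inc:,.0f}, lift: {lift:.0f}%")
```

Note what the calculation does not use: attribution windows, click timestamps, or device-level identifiers. That is why Lift is robust to the January 2026 changes.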
MMM (media mix modelling)
Marketing-mix modelling is the older incrementality discipline, dating to the pre-digital era and adapted for digital channels. MMM uses time-series regression on aggregate spend and conversion data — no user-level tracking required — to attribute conversions to channels and to specific channel components.
MMM has been quietly rehabilitated as the primary cross-platform measurement layer in the post-ATT era. It is robust to attribution-window changes (it ignores them) and robust to iOS gaps (also ignores them). The cost is methodological complexity — MMM requires clean spend and conversion data over a long time window, and the model needs to be re-fit periodically.
For accounts above $50K/month total ad spend, MMM is the best long-term answer to the 2026 rules. Several open-source frameworks (Robyn, LightweightMMM, PyMC-Marketing) make it more accessible than it was five years ago.
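A toy illustration of the core MMM mechanics — geometric adstock carryover plus regression on aggregate spend. The data here is synthetic and the model is deliberately minimal; real frameworks like Robyn or PyMC-Marketing add saturation curves, seasonality, and priors on top of exactly this skeleton.

```python
import numpy as np

def adstock(spend, decay=0.5):
    """Geometric adstock: each day's effective spend carries a decayed
    tail of previous days, modelling delayed ad response."""
    out = np.zeros_like(spend, dtype=float)
    carry = 0.0
    for i, s in enumerate(spend):
        carry = s + decay * carry
        out[i] = carry
    return out

# Synthetic daily data: conversions driven by two channels plus a baseline
rng = np.random.default_rng(0)
meta_spend = rng.uniform(500, 1500, 120)
google_spend = rng.uniform(300, 900, 120)
X = np.column_stack([
    np.ones(120),                  # intercept = organic baseline
    adstock(meta_spend, 0.6),
    adstock(google_spend, 0.3),
])
true_coefs = np.array([50.0, 0.04, 0.07])
conversions = X @ true_coefs + rng.normal(0, 5, 120)

# Fit by ordinary least squares; each coefficient is conversions per
# effective dollar -- the model's incrementality estimate for that channel
coefs, *_ = np.linalg.lstsq(X, conversions, rcond=None)
print("baseline, meta, google:", coefs.round(3))
```

No user-level tracking appears anywhere in the inputs — only daily spend and daily conversions — which is why MMM is indifferent to both the attribution-window change and the iOS gap.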
Filling the read with Google Ads and TikTok signals
The new Meta rules also make cross-platform reconciliation more important, because Meta-reported numbers in isolation are noisier than they were.
Google Ads as a triangulation reference
Google Ads' attribution windows did not change in January 2026. Default click attribution is still 30-day click for most conversion actions. This stability gives Google Ads numbers a longer historical comparability than Meta numbers and makes them useful as a triangulation reference.
For accounts running both, a useful pattern is: read Meta CPA against Google Ads CPA for the same conversion event over a rolling 28-day window. If both move together, the underlying conversion is moving. If only Meta moves, the move is more likely measurement-side than performance-side, and an MMM or Lift check can confirm.
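That read can be sketched as a small classifier. The CPA series and the 10% threshold below are hypothetical, purely to show the shape of the decision:

```python
def classify_cpa_move(meta_cpa, google_cpa, threshold=0.10):
    """Compare start-vs-end CPA change on two platforms for the same
    conversion event over the same window. Both moving together suggests
    a real performance change; a Meta-only move suggests a
    measurement-side artefact."""
    meta_change = (meta_cpa[-1] - meta_cpa[0]) / meta_cpa[0]
    google_change = (google_cpa[-1] - google_cpa[0]) / google_cpa[0]
    meta_moved = abs(meta_change) > threshold
    google_moved = abs(google_change) > threshold
    if meta_moved and google_moved and (meta_change * google_change > 0):
        return "both moved together: likely a real performance change"
    if meta_moved and not google_moved:
        return "Meta-only move: likely measurement-side; confirm with MMM/Lift"
    return "no clear move"

# Rolling 28-day average CPA, start vs end of window (hypothetical)
print(classify_cpa_move(meta_cpa=[42.0, 55.0], google_cpa=[48.0, 49.0]))
# -> Meta-only move: likely measurement-side; confirm with MMM/Lift
```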
TikTok signals for top-of-funnel volume
TikTok's attribution windows also did not change — defaults remain 7-day-click + 1-day-view (oddly, the same rule Meta dropped). TikTok's iOS gap is structurally smaller than Meta's because TikTok runs more of its measurement through its own SDK rather than an ATT-mediated browser pixel.
TikTok numbers are most useful as top-of-funnel volume signal, less so as bottom-of-funnel CPA truth (TikTok's own optimisation has different biases). For cross-platform reading, TikTok works best as the third leg of a triangulation pattern: Meta + Google Ads + TikTok converging on similar movement gives high confidence that the underlying number is real.
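The three-leg convergence check is mechanical enough to sketch. The changes and the 5% threshold are hypothetical:

```python
def directional_alignment(changes, threshold=0.05):
    """Given period-over-period changes per platform for the same
    conversion event, report whether every platform moved in the same
    direction by more than the threshold -- the high-confidence signal
    described above."""
    signs = {name: (1 if c > threshold else -1 if c < -threshold else 0)
             for name, c in changes.items()}
    moved = [s for s in signs.values() if s != 0]
    aligned = len(moved) == len(signs) and len(set(moved)) == 1
    return aligned, signs

# Hypothetical week-over-week conversion-volume changes
aligned, signs = directional_alignment(
    {"meta": 0.14, "google_ads": 0.09, "tiktok": 0.11})
print("all three moving together:", aligned, signs)
```

One platform flat or moving against the other two drops the check to `False`, which is the cue to suspect a platform-specific measurement artefact rather than a real change.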
For the full cross-platform reconciliation pattern, see reading cross-platform ROAS across Meta, Google, and TikTok.
Practical reconciliation pattern
A workable pattern for accounts running the three platforms together under the 2026 rules:
- Daily: read Meta CPA from Ads Manager (with the new attribution rules) for tactical bid decisions. Treat the absolute number as a relative signal — it is the right ranking metric across creatives, the wrong absolute number for ROI calculation.
- Weekly: reconcile Meta + Google Ads + TikTok numbers for the same conversion event. Look for directional alignment more than absolute alignment. Three platforms moving together = real change.
- Monthly: re-fit the MMM (or run an MMM if not already in place). Use MMM-attributed conversions as the truth-baseline for budget allocation decisions across platforms.
- Quarterly: run a Conversion Lift study on Meta. Compare Lift-measured incrementality against MMM-attributed incrementality. Disagreements are diagnostic — they usually point to a measurement-pipeline issue (missing CAPI events, broken tracking, mis-defined conversion event).
This pattern is more work than the pre-2026 single-source-of-truth model. The upside is that it produces a number that is robust to any single platform's attribution-rule changes — which, given Meta's track record, is worth the overhead.
What this changes about creative decisions
The 2026 rules change creative-level decisions less than they change account-level budget decisions. Creative-level performance comparisons (this creative versus that creative inside the same Meta account) are still valid because they are reading the same numbers under the same rules. Account-level budget comparisons (Meta versus Google Ads versus TikTok) are where the noise has compounded.
The implication for creative work: keep running the closed-loop creative iteration (what to test next) inside each platform separately, against the platform's own reported numbers. Use cross-platform reconciliation only at the budget-allocation layer above the creative iteration. Do not let attribution-rule changes contaminate the creative-level decisions, where the rules apply equally to every creative being compared.
What Omniscia reads on this
Cortex's cross-account benchmarks track CPA distributions across the customer base on each of Meta, Google Ads, and TikTok separately. The benchmarks are continuously re-fit, so the comparison stays calibrated to the rules in effect. Nexus aggregates per-creative performance across the three platforms and surfaces the directional alignment described above — three platforms moving together on the same creative is the cleanest signal an account can read under the 2026 rules.
For the upstream piece on diagnosing creative performance ahead of CPA reporting, see early signs of creative fatigue.
Frequently asked questions
Which Meta attribution windows are still available?
After 12 January 2026, the only click attribution available in Ads Manager is 1-day click for direct-response campaigns. View-through attribution defaults to 1-day. The 7-day-click default and the 28-day-click and 28-day-view options were removed. Meta has indicated longer windows may return in a different form (modelled rather than direct attribution), but as of writing they are not available.
How big is the iOS attribution gap on Meta in 2026?
Independent industry estimates put the iOS attribution gap on Meta at 40-70% for accounts without strong Conversions API instrumentation, and closer to 20% for accounts running CAPI alongside the browser pixel. The gap varies by vertical (higher for ecommerce DTC, lower for lead-gen), by conversion type (higher for purchase events, lower for lead-form events), and by audience iOS share.
Does Google Ads still show 30-day click attribution?
Yes. Google Ads default click attribution remains 30-day click for most conversion actions. Google has not announced changes equivalent to Meta's January 2026 window removal. This stability makes Google Ads numbers more comparable year-over-year than Meta numbers and useful as a triangulation reference for cross-platform reconciliation.
Can I reconcile Meta, Google Ads, and TikTok conversions?
Yes, but not by summing the three. Each platform reports the same conversion under its own attribution rules, so summing produces a number larger than the actual conversion count. Reconciliation works at the directional level (do the three platforms move together?) and at the incrementality level (run MMM or Lift studies to attribute incremental conversions per platform). For sub-monthly cycles, look for directional alignment; for monthly budget decisions, use MMM.
What is the workaround for the missing 28-day window?
For accounts that need longer-window measurement (high-consideration purchases, B2B, subscription products), the workarounds are: use Conversions API with first-party data to capture the full conversion path server-side; run periodic Conversion Lift studies for ground-truth incrementality; build an MMM that operates on a 90-day or longer time window. None of these recover the in-Ads-Manager 28-day view, but together they recover the underlying measurement insight the 28-day window provided.