Performance Max sent 63% of our budget to branded search. Google didn't tell us — we found out ourselves. Here's every number from 14 campaigns.
Between January and March 2026, we managed £42,000 in Google Ads spend across 14 campaigns for UK-based B2B and e-commerce clients. We tracked everything — not just the numbers Google wanted us to see, but the actual commercial outcomes that hit our clients' bank accounts.
The reason was simple: we were tired of the gap between what Google Ads reports and what actually happens. Every agency shows you platform ROAS. We wanted to show you real ROAS — and more importantly, which campaign types, bidding strategies, and structures actually drove profitable revenue in the current UK market.
What follows is the unfiltered data. Some of it makes Google look good. Some of it doesn't. We're publishing it because this is the kind of information we wish existed when we were building our own campaigns.
Here's where the money went. We split the £42,000 across four campaign types: Search (manual CPC and automated), Performance Max, YouTube action campaigns, and Demand Gen. The allocation wasn't equal — it was weighted toward what we expected to perform based on historical data, then adjusted weekly based on real results.
Search campaigns received £18,400 (44% of total spend). Of that, £9,200 went to manual CPC campaigns and £9,200 to Target CPA automated bidding — a deliberate 50/50 split to test the automation question directly. Performance Max received £14,700 (35%). YouTube action campaigns got £5,600 (13%), and Demand Gen took £3,300 (8%).
The industries covered: two SaaS companies (£16K combined), one e-commerce brand selling premium homeware (£14K), one professional services firm (£8K), and one local trades business (£4K). All UK-focused, all tracking offline conversions back into Google Ads.
Performance Max was our biggest disappointment — not because it didn't generate conversions, but because of where those conversions came from. When we dug into the placement reports and ran brand exclusion tests, we found that 63% of PMax conversions were coming from branded search terms. Google was essentially charging us to capture demand we already owned.
Here's the maths. PMax reported a 5.8x ROAS across our campaigns. Impressive, right? But when we excluded branded traffic (which would have come through organic search or a branded campaign at around £0.30 per click anyway), the non-branded PMax ROAS dropped to 1.9x. Factor in our clients' average 45% gross margin, and the real commercial ROAS on incremental PMax traffic was 0.86x. That's a loss.
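If you want to run the same adjustment on your own account, the arithmetic is simple enough to script. A minimal sketch in Python, using the PMax figures above; the only inputs are a brand-excluded ROAS and your gross margin:

```python
# Commercial ROAS, using the PMax figures above.
# Platform ROAS counts revenue; commercial ROAS counts gross profit.

def commercial_roas(platform_roas: float, gross_margin: float) -> float:
    """Gross profit returned per £1 of ad spend."""
    return platform_roas * gross_margin

def breakeven_roas(gross_margin: float) -> float:
    """Platform ROAS needed just to cover ad spend from gross profit."""
    return 1 / gross_margin

margin = 0.45            # our clients' average gross margin
pmax_reported = 5.8      # what Google Ads showed
pmax_non_branded = 1.9   # after stripping branded traffic

print(commercial_roas(pmax_non_branded, margin))  # 0.855, the ~0.86x loss above
print(breakeven_roas(margin))                     # ~2.22x platform ROAS just to break even
```

The break-even line is the useful by-product: at a 45% margin, any campaign reporting under roughly 2.2x platform ROAS is losing money before you even start questioning attribution.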
We ran a controlled test for the e-commerce client: two weeks with PMax on, two weeks with PMax off but a dedicated branded search campaign running. Total revenue barely moved — it dropped 4%, well within normal variance. But ad spend dropped 38%. The brand was paying a premium for PMax to claim conversions that Search would have captured anyway at a fraction of the cost.
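The incrementality arithmetic behind that test is worth showing, because it's what actually settles the cannibalisation question. A short sketch; the revenue and spend figures are illustrative round numbers rather than the client's actuals, but the 4% and 38% drops mirror the test above:

```python
# Incremental ROAS from an on/off (holdout) test.
# Revenue and spend figures are illustrative placeholders, not the
# client's actual numbers; the 4% and 38% drops mirror the test above.

def incremental_roas(rev_on: float, rev_off: float,
                     spend_on: float, spend_off: float) -> float:
    """Revenue gained per extra £1 spent when the channel is switched on."""
    return (rev_on - rev_off) / (spend_on - spend_off)

rev_on, spend_on = 100_000, 10_000    # fortnight with PMax on
rev_off, spend_off = 96_000, 6_200    # PMax off, branded search still running

print(incremental_roas(rev_on, rev_off, spend_on, spend_off))
# ~1.05x revenue per incremental pound, before any margin adjustment
```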
This doesn't mean PMax is useless. For the local trades business with minimal brand recognition, PMax genuinely discovered new audiences through Display and Discover placements. But for any brand with existing search demand, PMax is an expensive way to buy your own traffic.
The manual-versus-automated bidding test is the one we're most confident about, because we controlled for variables carefully. Same budgets, same landing pages, same audiences, same time periods. The only difference was bidding strategy: manual CPC with regular bid adjustments versus Target CPA with automated bidding.
Over 12 weeks, manual CPC campaigns delivered a CPA of £34.20 with 269 conversions. Target CPA campaigns delivered a CPA of £28.60 with 322 conversions. Automated bidding won — but with an important caveat. The first three weeks of Target CPA were terrible. CPA was £52 while the algorithm learned. If we'd judged it at the two-week mark (as many businesses do), we'd have killed it.
The quality gap was more interesting. When we tracked conversions to actual revenue (closed deals for B2B, completed purchases minus returns for e-commerce), manual CPC conversions had a 23% higher close rate. The Target CPA algorithm was optimising for the conversion event — form fills, add-to-carts — but it was finding people more likely to convert on the form and less likely to convert on the sale.
Net result: automated bidding produced more conversions at a lower CPA, but manual CPC produced more revenue per pound spent. For businesses with high-value, considered purchases, manual CPC with proper bid management still wins on commercial outcomes. For high-volume, lower-consideration purchases, let the algorithm do its thing — but give it at least 6 weeks before judging.
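Here's how that close-rate gap flips the result, worked through in code. The spend and conversion counts are from the test above; the 20% baseline close rate is a hypothetical placeholder, since we don't publish clients' pipeline figures, but the 23% gap is the one we measured:

```python
# Cost per closed deal, not cost per conversion event.
# Spend and conversion counts are from the 12-week test above; the 20%
# baseline close rate is a hypothetical placeholder.

spend = 9_200  # each arm received £9,200

arms = {
    "manual CPC": {"conversions": 269, "close_rate": 0.20 * 1.23},  # 23% higher
    "Target CPA": {"conversions": 322, "close_rate": 0.20},
}

for name, arm in arms.items():
    closed_deals = arm["conversions"] * arm["close_rate"]
    print(f"{name}: CPA £{spend / arm['conversions']:.2f}, "
          f"cost per closed deal £{spend / closed_deals:.2f}")

# manual CPC: CPA £34.20, cost per closed deal £139.03
# Target CPA: CPA £28.57, cost per closed deal £142.86
```

The CPA winner and the cost-per-closed-deal winner are different campaigns, which is the whole point of tracking past the conversion event.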
Everyone quotes Google's benchmark data. Here's what we actually paid across our UK campaigns in Q1 2026, broken down by industry and match type.
SaaS (B2B): Exact match averaged £4.80 per click, phrase match £3.90, broad match £2.60. The broad match CPCs were lower but conversion rates were 40% worse, making exact match the most cost-effective on a CPA basis. The most expensive keyword we ran was 'enterprise project management software' at £11.40 per click — and it was worth every penny with an 8.2% conversion rate.
E-commerce (premium homeware): Exact match £1.40, phrase match £1.10, broad match £0.70. Shopping campaigns averaged £0.45 per click with a 2.8% conversion rate. The surprise here was that text ads outperformed Shopping on ROAS for high-AOV items (average order value over £200). For items under £100, Shopping crushed everything else.
Professional services: Exact match £6.20, phrase match £5.10, broad match £3.80. Professional services had the highest CPCs but also the highest average deal value (£8,500), making the unit economics work despite the expensive clicks. The key was aggressive negative keyword management — without it, broad match wasted 35% of spend on irrelevant queries.
Local trades: Exact match £3.10, phrase match £2.40, broad match £1.80. Local Services Ads (LSAs) outperformed standard Search by 2.3x on CPA. If you're a local business and you're not running LSAs, you're overpaying for every lead.
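When you compare match types, raw CPC flatters broad match. Effective CPA has to account for both the lower conversion rate and the spend lost to irrelevant queries. A sketch of that calculation; the CPCs and the 35% waste figure are the professional services numbers above, but the conversion rates are illustrative, not per-keyword figures from our accounts:

```python
# Effective CPA by match type: raw CPC flatters broad match.
# CPCs and the 35% waste figure are the professional services numbers
# above; the conversion rates are illustrative placeholders.

def effective_cpa(cpc: float, cvr: float, wasted_share: float = 0.0) -> float:
    """Cost per conversion, treating spend on irrelevant queries as pure cost."""
    return cpc / (1 - wasted_share) / cvr

exact = effective_cpa(cpc=6.20, cvr=0.05)                     # £124 per conversion
broad = effective_cpa(cpc=3.80, cvr=0.03, wasted_share=0.35)  # ~£195 per conversion
print(f"exact £{exact:.0f}, broad £{broad:.0f}")
```

Once waste is priced in, broad match's cheaper clicks stop looking cheap, which is why the negative keyword management mattered so much.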
YouTube action campaigns and Demand Gen were our lowest-performing channels on direct CPA — and that's exactly what we expected. These are top-of-funnel channels being measured on bottom-of-funnel metrics, which is inherently unfair. But we ran them, so here are the numbers.
YouTube action campaigns: £5,600 spend, 23 direct conversions, £243 CPA. Terrible on paper. But during the 8 weeks these campaigns ran, branded search volume for the SaaS client increased 34%, and the retargeting pool grew by 12,000 users. The retargeting campaigns running alongside had their CPA drop from £41 to £29. Attribution models struggle with this, but the causation was clear from the timing.
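One way to sanity-check that halo effect is to credit YouTube with the saving it appeared to drive in retargeting. A rough sketch; the direct spend, conversions and retargeting CPA drop are the figures above, but the retargeting conversion volume is a hypothetical placeholder:

```python
# Halo-adjusted CPA for the YouTube action campaigns.
# Direct spend, conversions and the retargeting CPA drop are from above;
# the retargeting conversion volume is a hypothetical placeholder.

yt_spend, yt_direct_conversions = 5_600, 23
rt_cpa_before, rt_cpa_after = 41, 29
rt_conversions = 300  # hypothetical volume during the 8-week window

# Spend the retargeting campaigns saved while YouTube was live.
rt_saving = (rt_cpa_before - rt_cpa_after) * rt_conversions

halo_adjusted_cpa = (yt_spend - rt_saving) / yt_direct_conversions
print(f"£{halo_adjusted_cpa:.0f}")  # the £243 direct CPA falls to ~£87
```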
Demand Gen: £3,300 spend, 18 direct conversions, £183 CPA. Similar story — poor direct response, but measurable lift in brand awareness metrics. The Gmail placement in Demand Gen was surprisingly effective for the professional services client, generating 8 of the 18 conversions at a £94 CPA. If Google let us target Gmail placements alone, we'd double our Demand Gen budget tomorrow.
Our take: if you need every pound to generate a direct, trackable conversion this month, skip YouTube and Demand Gen entirely. If you can allocate 15-20% of budget to pipeline building and you're disciplined about measuring indirect effects, they work — but you need a 3-month minimum commitment to see results.
Transparency works both ways, so here's what didn't go as planned.
We underestimated how much PMax would cannibalise branded search. We should have set up brand exclusions from day one, but Google made this difficult (you need to submit a brand list through Google support, and it took 11 days to get approved). During those 11 days, PMax spent roughly £2,800 on branded terms that should have cost £400 through a standard branded campaign. That's £2,400 wasted because of a process bottleneck.
We leaned too heavily on exact match for the e-commerce client. Our instinct was to keep match types tight, but the Shopping campaigns and broad match discovery found genuinely new product searches we hadn't considered. 'Scandinavian-style table lamp' wasn't in our keyword list, but it converted at 4.1%. We left money on the table for the first three weeks by being too conservative.
We also set Target CPA targets too aggressively at launch, which caused the algorithm to barely spend for the first week. When we loosened the targets by 30% and let the algorithm find its range, performance improved dramatically. The lesson: give automated bidding a realistic target to start with, then tighten as it learns.
If we were starting these campaigns again, we'd first launch PMax with brand exclusions from day one. No exceptions. Submit the brand list to Google the moment the account is created, not when you notice the cannibalisation. This single change would have saved approximately £3,200 across our campaigns.
Second, we'd give automated bidding a longer leash at launch. Start with a Target CPA that's 20-30% above where you want to end up. Let the algorithm spend and learn for 4-6 weeks. Then gradually tighten. Trying to force a tight CPA target from the start just throttles spend and delays learning.
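As a concrete version of that ramp, here's the schedule we'd follow. The 5% weekly step is our judgment call, not a Google-documented figure:

```python
# Target CPA launch ramp: hold a loose target through learning, then
# tighten gradually. The 5% weekly step is our judgment call, not a
# Google-documented figure.

def tcpa_schedule(goal_cpa: float, launch_multiplier: float = 1.3,
                  learning_weeks: int = 6, weekly_tighten: float = 0.05,
                  total_weeks: int = 12):
    target = goal_cpa * launch_multiplier
    for week in range(1, total_weeks + 1):
        if week > learning_weeks:
            target = max(goal_cpa, target * (1 - weekly_tighten))
        yield week, target

for week, target in tcpa_schedule(goal_cpa=30.00):
    print(f"week {week}: target CPA £{target:.2f}")
# Weeks 1-6 hold at £39.00, then step down towards £30.00.
```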
Third, we'd allocate 15% of budget to broad match discovery from the start, even in B2B. The keywords your customers actually type are rarely the keywords you think they type. Use broad match to find those terms, then graduate the winners to exact match campaigns.
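Graduating winners is easy to systematise from a search terms report export. A minimal sketch, assuming a CSV with 'Search term', 'Conversions' and 'Cost' columns; check the names against your own export's headers before running it:

```python
# Mine a search terms report export for exact-match graduation candidates.
# Assumes a CSV export with "Search term", "Conversions" and "Cost"
# columns; check the names against your own export's headers.

import csv

def graduation_candidates(path: str, min_conversions: float = 3,
                          max_cpa: float = 50.0):
    """Search terms that converted often and cheaply enough to promote."""
    candidates = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            conversions = float(row["Conversions"])
            cost = float(row["Cost"].replace("£", "").replace(",", ""))
            if conversions >= min_conversions and cost / conversions <= max_cpa:
                candidates.append((row["Search term"], conversions, cost / conversions))
    return sorted(candidates, key=lambda c: c[2])  # cheapest CPA first

for term, conversions, cpa in graduation_candidates("search_terms.csv"):
    print(f"{term}: {conversions:.0f} conversions at £{cpa:.2f} CPA")
```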
Fourth, we'd run YouTube and Demand Gen only when branded search and retargeting are already performing well. These channels amplify existing momentum — they don't create it from scratch. If your bottom-funnel campaigns aren't profitable yet, fix those first.
Fifth, we'd implement offline conversion tracking before launching any campaign. Every client in this study had it running, and it transformed the quality of Smart Bidding optimisation. Feeding real revenue data back to Google made a measurable difference within 3-4 weeks.
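If you're setting this up, the simplest route is a click conversion import: capture the GCLID when the lead comes in, store it against the record in your CRM, and upload closed deals on a schedule. A minimal sketch of the upload file; the headers follow Google's click conversion import template, but download the current template from your account before uploading:

```python
# Build a Google Ads offline (click) conversion import file.
# Headers follow Google's click conversion import template; verify
# against the current template in your account before uploading.
# The CRM rows below are hypothetical.

import csv

HEADERS = ["Google Click ID", "Conversion Name", "Conversion Time",
           "Conversion Value", "Conversion Currency"]

closed_deals = [
    # (gclid captured at form submission, close time, deal value)
    ("Cj0KCQ_example_gclid", "2026-03-15 14:30:00", 8500.00),
]

with open("offline_conversions.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(HEADERS)
    for gclid, closed_at, revenue in closed_deals:
        writer.writerow([gclid, "Closed Deal", closed_at, revenue, "GBP"])
```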
Want this level of analysis on your own account? We'll audit your Google Ads setup, identify wasted spend, and show you the real commercial ROAS behind your campaigns, not the vanity metrics.