What 6 real B2B ad campaigns got right (and wrong)



Hello and welcome to the Exit Five Weekly Newsletter — read by 42,000 B2B marketing professionals around the world. Exit Five is a membership site designed to help you build a successful career in B2B marketing. Join 5,700 other members at exitfive.com.
By the way — this email was designed in Knak, which helps you create email and landing pages in minutes without having to write code. Learn more about Knak here.
TOGETHER WITH CUSTOMER.IO
🧪 10 Growth Marketing Experiments Worth Stealing

Want to skip the trial-and-error and get right to what works?
The team at Customer.io just dropped a new playbook that documents 10 growth marketing experiments they’ve run across the full customer lifecycle, from acquisition to churn and win-back.
What’s inside isn’t fluffy “tips.” These are real experiments with clear hypotheses, tested methodologies, and results.
You’ll learn:
- How one change to a trial expiration email increased checkout rates by 46%
- Why a plain-text welcome email outperformed its prettier HTML version in long-term revenue
- The small tweak that boosted activation rates by 417% (yes, you read that right)
- And how a multi-stage churn campaign uncovered the hidden metric that best predicts churn
It’s the kind of resource that sharpens your thinking and gives you the ammo to run smarter experiments yourself.
Grab the playbook here and bring a few of these ideas into your next growth meeting.
📊 6 Real B2B Ad Campaigns: What Actually Worked (And What Didn’t)
Pranav Piyush has 15 years of growth and marketing experience at companies like PayPal, Dropbox, Adobe, and BILL. And he kept seeing the same problem over and over again: marketers making million-dollar decisions based on terrible data.
So he started Paramark to run proper experiments for B2B companies. We're talking controlled tests with statistical significance (the kind that actually prove causation, not just correlation).
And recently, he shared results from real campaigns his team has tested for venture-backed companies. These aren't cherry-picked wins or theoretical case studies. They're controlled experiments that show what actually drives results.
So whether you're managing millions or testing with a smaller budget, these examples will change how you think about ad spend.
1. The Branded Search Reality Check
Two venture-backed SaaS companies (Series C and Series E) cut branded search ads to zero in select geos for 4-6 weeks.
The Result: Zero impact on business metrics. All traffic shifted to organic search.
The Savings: Millions annually.
The Lesson: Test this immediately. Pick a few states, turn off branded search, and measure actual conversion metrics.
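If you want a feel for how the measurement side of a geo-holdout works, here's a minimal sketch in plain Python (not Paramark's actual methodology, and the conversion numbers are made up): compare per-geo conversions in the paused states against the control states, and use a permutation test to check whether the difference is bigger than chance.

```python
import random

def geo_holdout_lift(test, control, n_perm=10_000, seed=0):
    """Estimate the lift of test geos over control geos, plus a
    permutation p-value for that difference.

    `test` and `control` are lists of per-geo conversion counts.
    """
    rng = random.Random(seed)
    observed = sum(test) / len(test) - sum(control) / len(control)
    combined = list(test) + list(control)
    n_test = len(test)
    extreme = 0
    for _ in range(n_perm):
        # Shuffle geo labels and recompute the difference in means
        rng.shuffle(combined)
        perm_diff = (sum(combined[:n_test]) / n_test
                     - sum(combined[n_test:]) / len(control))
        if abs(perm_diff) >= abs(observed):
            extreme += 1
    return observed, extreme / n_perm

# Hypothetical weekly conversions per geo:
test_geos = [102, 98, 110, 95, 101]     # branded search ads paused
control_geos = [100, 97, 108, 96, 103]  # branded search ads running

lift, p_value = geo_holdout_lift(test_geos, control_geos)
```

Here a tiny difference with a large p-value would match the "zero impact" result above: organic search absorbed the traffic the ads were taking credit for.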
2. Billboards That Actually Convert
A FinTech company ran digital out-of-home ads in specific target markets only.
The Result: A clear lift in applications in test geos vs. control, visible almost immediately.
Why It Worked: They were saturated on Google search and needed new growth channels.
The Lesson: Test geo-targeted outdoor advertising. Use platforms like Quivr or OneScreen.
3. YouTube's Split Personality
Two companies tested YouTube ads with identical geo-holdout methodology.
Company A Result: Clear lift in MQLs, positive ROI.
Company B Result: Zero impact despite $1M spend.
The Difference: Company B had a strong organic YouTube presence. Company A had none.
The Lesson: Audit your organic presence before paying for ads.
4. The CTV Win (Small Budget Edition)
A Series F SaaS company tested Connected TV with just a $30-40K budget.
The Result: Clear lift despite higher noise from smaller spend.
Why It Worked: They were hitting saturation on direct response channels and needed top-of-funnel expansion.
The Lesson: You don’t need millions. Start with $30K focused on specific geos.
5. The Expensive Multi-Channel Mistake
A Sequoia-backed company launched YouTube, billboards, CTV, and Meta simultaneously.
The Result: Clear lift, but cost per conversion was 10x higher than other channels.
The Lesson: Test channels individually before layering them.
6. Performance Max Redemption
A large company tested Google’s Performance Max in 10 states.
The Result: Positive MQL lift when excluding branded inventory.
The Lesson: Configure it correctly: exclude brand terms and focus on fresh surfaces.
The Bottom Line? Real testing means one variable, one geography, one time period. Paramark doubled their own traffic by testing one channel at a time, starting with an Exit Five sponsorship.
📺 UPCOMING EVENTS
Exit Five Pro Members Only
- July 11th: Non-SaaS Marketing Meetup: How to Master B2B Virtual Events
- July 22nd: 3 Smart Ways Marketers Are Actually Using AI
- July 23rd: Women in B2B: Setting Boundaries and Expectations, Professionally and Respectfully
CMO Club Members Only
- July 10th: AMA about ABM with Mason Cosby, CEO & Founder of Scrappy ABM
- July 31st: AI Spotlight - How Are B2B Marketing Leaders Using AI?
Not a member yet? Learn more about an Exit Five Membership here.
If you are an Exit Five member, click here to RSVP to these events.
📚 LATEST CONTENT
Here's the latest from the Exit Five content library:
- 🎧 Why I Love Marketing (Quick Voice Note From Dave)
- 🎧 The Future of B2B Marketing: AI, Execution, and Craft with Kieran Flanagan
- 🎧 Email Deliverability: What Every B2B Marketer Needs to Know
- 📰 Marketing Is Broken: Why You’re So Busy and Still Not Getting Anything Done
- 📰 How to Put On a Virtual Event People Actually Want to Attend
- 📰 The LinkedIn Playbook Every B2B Marketer Should Steal
🪪 BECOME AN EXIT FIVE MEMBER
What’s actually inside the Exit Five Membership?

We made you this 131-second video to show you around. Have a peek. 👀
Join 5,700 Other Exit Five Members
Thanks to our friends over at Knak for being our 2025 newsletter sponsor.

Want to sponsor a future newsletter or learn more about other sponsorship opportunities with Exit Five? Email hi@exitfive.com (or just reply to this email).
