How to Measure Radio Advertising Performance: Complete Guide to Tracking ROI and Attribution
“I know half my advertising budget is wasted—I just don’t know which half.”
This famous quote (attributed to various advertising pioneers) doesn’t need to apply to your radio campaigns. With proper measurement infrastructure, radio advertising delivers clear, trackable results across multiple attribution points.
The challenge isn’t whether radio performance can be measured—it’s ensuring you’re tracking the right metrics with the right methods. This comprehensive guide explains exactly what to measure, how to measure it, which tools to use (many free), and how to calculate true ROI.
Why Radio Attribution Seems Challenging (And Why It Isn’t)
The perceived problem:
Unlike digital advertising where every click is tracked automatically, radio attribution requires intentional systems. This leads many businesses to assume radio is “unmeasurable”—but that’s a measurement design problem, not a radio problem.
The reality:
Radio is highly measurable when you implement proper attribution infrastructure. In fact, radio’s attribution is often MORE accurate than digital attribution because:
✅ Self-reported attribution is reliable: When customers say “I heard you on the radio,” they’re typically correct
✅ Behavioral signals are clear: Traffic spikes, search lift, and call volume patterns clearly correlate with spot schedules
✅ Multi-method validation: Combining multiple attribution approaches provides a comprehensive picture
✅ Long-term impact trackable: Unlike 30-day cookie windows, radio builds awareness you can track across the full customer journey
The key: Build measurement into your campaign from day one, not as an afterthought.
The Four-Tier Radio Measurement Framework
Effective radio attribution uses multiple methods across four tiers—each captures different aspects of performance:
Tier 1: Direct Response Attribution (Immediate Conversion)
What it measures: Customers who respond immediately after hearing your ad
Attribution Methods:
1. Self-Reported Source Tracking
How it works:
- Train ALL staff to consistently ask: “How did you hear about us?”
- Log response in CRM immediately
- Track weekly trends
Implementation:
✅ Good implementation:
- Mandatory field in CRM (can’t skip)
- Standardized response options (dropdown: Radio, Google, Facebook, Referral, etc.)
- Weekly reporting dashboard
- Staff training emphasizes importance
- Accountability for consistency
❌ Poor implementation:
- Optional question (staff forgets)
- Free-text field (inconsistent responses: “The radio,” “radio station,” “88.9,” “Shine”)
- Monthly review (too slow for optimization)
- No training or accountability
Expected capture rate: 25-40% of radio-driven leads will self-report “radio” when asked consistently.
Why not 100%: Some customers don’t remember their specific source (especially over longer consideration cycles), while others searched Google after hearing the ad and report “Google” instead.
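The logging discipline above can be sketched as a minimal in-house lead log (field names are illustrative; a real CRM would enforce the mandatory dropdown for you):

```python
from collections import Counter

# Standardized dropdown options -- free-text answers get normalized to these.
SOURCES = {"Radio", "Google", "Facebook", "Referral", "Other"}

def log_lead(leads, name, source):
    """Append a lead; source is mandatory and must match a dropdown option."""
    if source not in SOURCES:
        raise ValueError(f"Unknown source {source!r}; pick one of {sorted(SOURCES)}")
    leads.append({"name": name, "source": source})

def weekly_source_counts(leads):
    """Tally leads by source for the weekly report."""
    return Counter(lead["source"] for lead in leads)

leads = []
log_lead(leads, "A. Customer", "Radio")
log_lead(leads, "B. Customer", "Google")
log_lead(leads, "C. Customer", "Radio")
print(weekly_source_counts(leads))
```

Because the source field rejects free text, the weekly tally never fragments into “The radio,” “radio station,” and “88.9.”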
2. Campaign-Specific Phone Numbers
How it works:
- Unique phone number used ONLY in radio spots
- All calls to this number = attributed to radio
- Call tracking software logs details
Options:
Basic Approach (Free):
- Use separate business line for radio
- Forward to main line
- Track volume manually
Advanced Approach ($30-150/month):
- Call tracking platform (CallRail, CallTrackingMetrics, CallFire)
- Dynamic number insertion
- Call recording
- Keyword-level tracking
- Integration with CRM
Benefits:
- Precise attribution (100% of calls to this number came from radio)
- Call recordings enable quality control
- Time-of-day patterns validate spot schedule effectiveness
- Geographic data shows coverage reach
Considerations:
- Need memorable number for radio (vanity numbers recommended)
- Requires separate number in marketing materials (can’t use on website simultaneously)
- Small businesses may prefer single-number simplicity
Recommendation: Start with self-reported tracking (free), add call tracking if budget allows ($50-100/month tier provides substantial value).
3. Campaign-Specific URLs
How it works:
- Mention unique URL in radio spot (YourBusiness.ca/shine or YourBusiness.ca/radio)
- Google Analytics tracks traffic to this specific page
- Landing page = clear radio attribution
Implementation:
Step 1: Create Campaign Landing Page
- Duplicate your main service/contact page
- Change URL to campaign-specific slug
- Include campaign-specific offer from radio spot
- Add tracking parameters
Step 2: Set Up Google Analytics Tracking
- Create custom report for campaign URL
- Track: visits, bounce rate, conversions, time on site
- Set up goal tracking (form submissions, phone clicks)
Step 3: Optimize Landing Page for Radio Traffic
- Mobile-first design (70%+ of radio-driven traffic is mobile)
- Click-to-call button above fold
- Minimal form fields
- Fast load time (<2 seconds)
Expected traffic: Campaign URLs typically capture 15-25% of total radio-driven web traffic (remainder goes directly to homepage or searches Google).
4. Promo Codes
How it works:
- Mention unique code in radio spot (“Mention code SHINE25 for discount”)
- Track redemptions at point of sale
- Code used = attributed to radio
Best practices:
✅ Easy to remember: SHINE25, RADIO50, HEAT15
✅ Clear value: “$25 off” or “25% discount”
✅ Easy to apply: Works in online checkout AND phone orders
✅ Tracked in POS: Log every redemption with customer details
Advantages:
- Precise attribution
- Provides incentive (increases response)
- Works for e-commerce and traditional retail
- Easy for customers to use
Limitations:
- Some customers forget code (lost attribution)
- Some customers seek discount through other channels after hearing radio
- Requires staff training for consistent tracking
Expected capture: 30-50% of radio-driven customers will use promo code (higher with strong incentive).
Tier 2: Behavioral Signal Tracking (Indirect Validation)
What it measures: Changes in search and traffic patterns that validate radio’s impact even when customers don’t report source
5. Branded Search Volume Lift
How it works: Radio exposure drives people to search your company name → Google Search Console tracks these queries → compare volume during campaign vs. baseline
Setup Instructions:
Step 1: Access Google Search Console (Free)
- Verify ownership of your website at search.google.com/search-console
- Navigate to Performance → Search Results
- Filter by date range
Step 2: Establish Baseline
- Document branded search volume 2-4 weeks BEFORE campaign
- Branded queries = searches containing your company name
- Example: “Mountain View Heating,” “Mountain View HVAC Calgary”
- Record average daily/weekly volume
Step 3: Monitor During Campaign
- Track same branded queries weekly during campaign
- Look for lift percentage
- Note timing (does lift correlate with spot schedule?)
Step 4: Validate with Hold-Out
- If budget allows, run campaign for 4 weeks, pause for 2 weeks, resume for 4 weeks
- Branded search should drop during pause, resume when campaign returns
Expected lift: Research shows an average 29% branded-search increase during radio campaigns, with the strongest lift in morning drive and midday dayparts.
Why this matters: Even if customers report “Google” as their source, radio drove them to search. Branded search lift validates radio’s impact independent of self-reported attribution.
Advanced Analysis:
Compare branded search lift to:
- Spot schedule (heaviest spot days should show highest search volume)
- Daypart (morning drive creates AM search spikes)
- Geographic patterns (Calgary searches should dominate if only advertising locally)
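The baseline-versus-campaign comparison can be sketched in a few lines (the weekly counts below are illustrative, not real Search Console exports):

```python
def search_lift(baseline_weekly, campaign_weekly):
    """Percentage lift in branded search volume versus the pre-campaign baseline."""
    base = sum(baseline_weekly) / len(baseline_weekly)
    camp = sum(campaign_weekly) / len(campaign_weekly)
    return (camp - base) / base * 100

# Weekly branded query counts (illustrative numbers).
baseline = [18, 20, 19, 17]   # 4 weeks before launch
campaign = [22, 29, 34, 38]   # 4 weeks during the flight
print(f"{search_lift(baseline, campaign):.0f}% branded search lift")
```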
6. Direct Website Traffic Analysis
How it works: Google Analytics tracks “direct traffic” (people typing your URL directly or using bookmarks) → Radio encourages direct visits → Track increases
Setup in Google Analytics 4:
Step 1: Access Traffic Sources
- Navigate to Reports → Acquisition → Traffic Acquisition
- Compare date ranges (baseline vs. campaign period)
Step 2: Analyze Direct Traffic
- Filter to “Direct” channel
- Look for percentage increase during campaign
- Note time-of-day patterns
Step 3: Cross-Reference with Spot Schedule
- Plot daily direct traffic against spot schedule
- Heavy spot days should correlate with traffic spikes
- Some lag is normal (the strongest effect appears within 24 hours of exposure)
Expected lift: Studies document an approximate 52% increase in direct traffic during radio campaigns.
Additional insights:
- Mobile vs. desktop (radio-driven = typically 70% mobile)
- New vs. returning visitors (radio builds new audience)
- Time on site and bounce rate (quality of traffic)
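One way to quantify the cross-reference in Step 3 is a simple correlation between daily spot counts and daily direct sessions (numbers below are illustrative; real data comes from your spot schedule and a GA4 export):

```python
def pearson(xs, ys):
    """Pearson correlation between daily spot counts and daily direct visits."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One illustrative week: spots aired per day vs. direct website sessions.
spots  = [6, 6, 4, 6, 2, 0, 0]
visits = [78, 82, 70, 85, 60, 48, 46]
r = pearson(spots, visits)
print(f"correlation r = {r:.2f}")  # values near 1.0 suggest traffic tracks the spot schedule
```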
7. Organic Search Traffic Analysis
How it works: Radio builds brand awareness → People search your brand name → Click organic results (not ads) → Organic traffic increases
Why this matters: You’re not paying for these clicks. Radio drives “free” organic traffic by building awareness that leads to searches.
How to track:
- Google Analytics: Acquisition → Traffic Acquisition → Organic Search
- Compare campaign period vs. baseline
- Look for 20-40% lift during campaigns
Bonus benefit: Increased click-through rate on your branded organic results can improve SEO rankings over time (Google treats higher engagement as a positive signal).
Tier 3: Lead Quality and Conversion Metrics
What it measures: Not just VOLUME of leads, but QUALITY and downstream conversion
8. Lead-to-Customer Conversion Rate
The question: Do radio-driven leads convert at higher/lower rates than other channels?
How to track:
Step 1: Tag Lead Source in CRM
- Mark every lead with source (Radio, Google Ads, Facebook, Referral, etc.)
- Track through entire pipeline
Step 2: Calculate Conversion Rates by Source
- Radio leads converted / Total radio leads = Radio conversion rate
- Compare to other channels
Step 3: Analyze Conversion Time
- How long from first contact to close?
- Does radio have longer/shorter sales cycle than other channels?
Common finding: Radio-driven leads often convert at HIGHER rates than cold digital traffic because:
- Pre-qualified (heard value proposition, chose to respond)
- Warmer (built familiarity through repeated exposure)
- Higher trust (local radio provides implied endorsement)
Example:
Channel: Google Ads
- Leads generated: 100
- Converted to customers: 8
- Conversion rate: 8%
Channel: Radio
- Leads generated: 75
- Converted to customers: 15
- Conversion rate: 20%
Insight: Despite fewer total leads, radio generated nearly twice as many customers due to its higher conversion rate.
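The comparison above reduces to a one-line calculation per channel (a sketch using the example figures):

```python
def conversion_rate(leads, customers):
    """Percentage of leads that became customers."""
    return customers / leads * 100

# Example figures from the comparison above.
channels = {
    "Google Ads": {"leads": 100, "customers": 8},
    "Radio":      {"leads": 75,  "customers": 15},
}
for name, c in channels.items():
    print(f"{name}: {conversion_rate(c['leads'], c['customers']):.0f}% conversion")
```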
9. Average Order Value / Customer Lifetime Value
The question: Do radio-acquired customers spend more/less than customers from other channels?
How to track:
- Tag customer source in accounting/CRM system
- Calculate average first purchase by source
- Track repeat purchases and lifetime value by source
Common finding: Radio customers often have higher CLV because:
- Local connection = stronger loyalty
- Brand building creates premium perception
- Referred customers (radio listeners recommend businesses they heard about)
Example:
Channel: Google Ads
- Average first purchase: $850
- Repeat purchase rate: 15%
- Estimated CLV: $1,200
Channel: Radio
- Average first purchase: $1,050
- Repeat purchase rate: 28%
- Estimated CLV: $2,100
Strategic implication: Even if radio’s cost-per-lead is HIGHER than digital, better conversion rates and higher CLV can deliver better ROI.
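There is no single CLV formula; one simple illustrative model adds expected repeat-purchase value to the first sale (the parameters below are assumptions, and this model will not reproduce the example figures above exactly):

```python
def estimate_clv(first_purchase, repeat_rate, avg_repeat_value, expected_repeats):
    """One simple CLV model: first sale plus expected repeat-purchase value.
    All parameters are assumptions to be replaced with your own data."""
    return first_purchase + repeat_rate * avg_repeat_value * expected_repeats

# Hypothetical radio-acquired customer: $1,050 first sale, 28% repeat rate,
# repeat orders also averaging $1,050, roughly 3 repeat opportunities tracked.
clv = estimate_clv(1_050, 0.28, 1_050, 3)
print(f"Estimated CLV: ${clv:,.0f}")
```

Whatever model you use, apply it identically to every channel so the comparison stays fair.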
10. Speed to Conversion
The question: How quickly do radio leads convert versus other channels?
Why it matters:
- Faster conversion = better cash flow
- Shorter sales cycle = lower cost to acquire
- Urgency indicators help optimize offers
How to track:
- Log date of first contact and date of conversion
- Calculate average days-to-close by channel
- Compare radio vs. other sources
Insight: Radio often drives faster conversion for:
- Immediate need categories (emergency HVAC, urgent dental)
- Seasonal/timely offers (pre-winter tune-up, tax season)
- Strong urgency mechanisms (“Book by Friday”)
Tier 4: Financial Performance Metrics
What it measures: Ultimate bottom-line performance—ROI, CAC, ROAS
11. Cost Per Lead (CPL)
Calculation: Total radio investment ÷ Number of attributed leads = CPL
Example:
- Radio investment: $5,000
- Attributed leads (all methods combined): 68
- CPL: $73.53
Context needed: Compare to other channel CPLs:
- Google Ads CPL: $120
- Facebook Ads CPL: $85
- Radio CPL: $73.53
12. Customer Acquisition Cost (CAC)
Calculation: Total radio investment ÷ Number of new customers = CAC
Example:
- Radio investment: $5,000
- New customers attributed to radio: 22
- CAC: $227
Context needed: Compare to:
- Customer Lifetime Value (CLV)
- Other channel CACs
- Your profit margins
Healthy ratio: CAC should be 1/3 or less of CLV for sustainable growth.
13. Return on Ad Spend (ROAS)
Calculation: Revenue attributed to radio ÷ Radio investment = ROAS
Example:
- Radio investment: $5,000
- Revenue from radio-attributed customers: $42,000
- ROAS: 8.4x (or 840%)
Interpretation:
- ROAS > 3x typically considered strong performance
- ROAS > 5x excellent
- ROAS 8-10x exceptional
Important note: Track immediate revenue AND projected lifetime value for complete picture.
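Using the example figures from the three metric sections above, the calculations reduce to one-liners (plug in your own spend, lead, customer, and revenue numbers):

```python
def cpl(spend, leads):
    """Cost per lead."""
    return spend / leads

def cac(spend, customers):
    """Customer acquisition cost."""
    return spend / customers

def roas(revenue, spend):
    """Return on ad spend, as a multiple."""
    return revenue / spend

spend = 5_000
print(f"CPL:  ${cpl(spend, 68):.2f}")       # $73.53
print(f"CAC:  ${cac(spend, 22):.0f}")       # $227
print(f"ROAS: {roas(42_000, spend):.1f}x")  # 8.4x
```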
14. Break-Even Analysis
The question: How many customers do you need from radio to break even?
Calculation:
Step 1: Determine gross profit per customer
- Average sale: $1,500
- Cost of goods/services: $800
- Gross profit: $700
Step 2: Calculate break-even customer count
- Radio investment: $5,000
- Break-even customers: $5,000 ÷ $700 = 7.14 customers
Step 3: Compare to actual results
- Actual new customers: 22
- Break-even: 7 customers
- Actual results delivered 3x the break-even volume
Strategic use: Set realistic expectations before campaign launches. “We need 8 new customers to break even—anything beyond that is profit.”
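A quick sketch of the break-even arithmetic, rounding up to whole customers:

```python
import math

def break_even_customers(spend, avg_sale, cost_of_goods):
    """Customers needed to cover the radio spend from gross profit."""
    gross_profit = avg_sale - cost_of_goods
    return math.ceil(spend / gross_profit)

needed = break_even_customers(5_000, 1_500, 800)
print(needed)  # 8 -- matches "we need 8 new customers to break even"
```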
Building Your Attribution Dashboard
Multi-Method Attribution Summary
Create weekly tracking dashboard combining all methods:
Example Dashboard:
| Metric | Baseline | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 |
|---|---|---|---|---|---|---|---|
| Self-Reported Leads | 0 | 8 | 12 | 15 | 18 | 16 | 14 |
| Campaign URL Visits | 0 | 42 | 67 | 89 | 102 | 94 | 88 |
| Promo Code Uses | 0 | 5 | 9 | 12 | 11 | 10 | 9 |
| Branded Search Volume | 18/wk | 22 | 29 | 34 | 38 | 35 | 31 |
| Direct Traffic (daily avg) | 45 | 58 | 72 | 78 | 84 | 79 | 73 |
| Total Phone Calls | 32/wk | 41 | 48 | 54 | 59 | 57 | 52 |
| New Customers | 3/wk | 4 | 6 | 8 | 9 | 8 | 7 |
Dashboard insights:
✅ Week 1: Initial response (early adopters, immediate needs)
✅ Weeks 2-4: Building momentum as frequency accumulates
✅ Week 4: Peak performance (optimal frequency achieved)
✅ Weeks 5-6: Sustained performance (some decline normal)
Key observations:
- All metrics show lift (validates radio impact)
- Self-reported leads = 24% of total response (remaining 76% came through other measurable channels)
- Combining ALL methods provides complete attribution picture
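A lightweight way to generate lift columns for such a dashboard (two rows from the example table shown):

```python
def lift_pct(baseline, value):
    """Percentage lift of a weekly value over the pre-campaign baseline."""
    return (value - baseline) / baseline * 100

# Baseline and weeks 1-6, taken from the dashboard table above.
dashboard = {
    "Branded search": (18, [22, 29, 34, 38, 35, 31]),
    "New customers":  (3,  [4, 6, 8, 9, 8, 7]),
}
for metric, (base, weeks) in dashboard.items():
    print(metric, [f"{lift_pct(base, w):+.0f}%" for w in weeks])
```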
Common Attribution Pitfalls and How to Avoid Them
Pitfall 1: Only Tracking Self-Reported Attribution
Problem: “Only 15 people said ‘radio’ when we asked, so radio didn’t work.”
Reality: Self-reported attribution typically captures 25-40% of radio response. Remaining 60-75% arrives through branded search, direct traffic, and organic discovery.
Solution: Track ALL Tier 1 and Tier 2 metrics, not just “How did you hear about us?”
Pitfall 2: Judging Success Too Early
Problem: “It’s been 3 days and we haven’t seen results.”
Reality: Radio requires frequency building. Week 1 results rarely represent campaign potential.
Solution: Commit to minimum 4-6 week flights. Evaluate performance at weeks 3-4, not day 3.
Pitfall 3: Inconsistent Tracking
Problem: Only 50% of staff ask about source = only 50% of leads properly attributed.
Reality: Inconsistent tracking makes radio appear to underperform relative to its actual results.
Solution: Make source tracking MANDATORY in CRM. Weekly audits. Train all staff. Emphasize importance.
Pitfall 4: Attribution Window Too Short
Problem: 30-day attribution window misses customers with longer consideration cycles.
Reality: B2B, healthcare, financial services often have 60-180 day decision timelines.
Solution: Track “first touch” separately from “last touch.” Many customers heard radio weeks/months before converting.
Pitfall 5: Ignoring Baseline Comparison
Problem: “We got 50 leads during the campaign” (but no context if that’s good or bad).
Reality: Without a baseline, you can’t determine whether radio drove incremental leads.
Solution: Document 2-4 weeks of baseline data BEFORE campaign. Compare campaign performance to baseline, not to zero.
Pitfall 6: Not Tracking Lead Quality
Problem: “Radio generated lots of leads but they didn’t convert.”
Reality: May indicate offer/creative issue, not channel issue. OR radio leads may be higher quality (need to track conversion rate).
Solution: Track leads through full pipeline. Measure conversion rate, not just lead volume.
FAQ Section: Measurement & Attribution
Q: What’s the bare minimum I need to track to know if radio is working?
A: Three essential metrics provide basic validation:
1. Total inquiry volume
- Count ALL phone calls, web form submissions, walk-ins
- Compare campaign weeks to baseline (pre-campaign weeks)
- Look for 20-40% lift during campaign
2. “How did you hear about us?” responses
- Consistently ask every contact
- Log in spreadsheet (or CRM if available)
- Aim for 25-40% self-reporting “radio”
3. Google Search Console branded search volume
- Free tool at search.google.com/search-console
- Check weekly branded search count
- Look for 20-30% lift during campaign
If these three metrics ALL show lift during campaign, radio is working—even if you don’t have sophisticated attribution infrastructure.
Implementation time: 2-3 hours to set up all three methods.
Q: How do I handle attribution when customers call weeks or months after hearing my radio ad?
A: Two-question approach captures both immediate and delayed attribution:
Question 1: “How did you first hear about us?”
- Captures initial awareness source
- Many will recall “radio” even months later
- Documents first touch in customer journey
Question 2: “What triggered you to contact us today?”
- Captures immediate reason for action
- May be different from initial awareness (“saw your Google review” after initially hearing radio)
- Documents last touch
Example conversation:
Customer: “I’d like to schedule a consultation.”
You: “Great! How did you first hear about our company?”
Customer: “I heard you on Shine FM a couple months ago.”
You: “Thanks! And what specifically triggered you to call today?”
Customer: “My neighbor mentioned they used you and I remembered hearing your ads.”
Attribution insight:
- First touch: Radio (builds awareness)
- Last touch: Referral (triggers action)
- Both matter: Radio created awareness that led to referral opportunity
CRM tracking: Log BOTH sources. Create custom fields:
- “First Touch Source”
- “Last Touch Source”
- “Days from First Touch to Conversion”
Analysis:
- Track how many customers report radio as first touch (even if converting later via other triggers)
- Calculate average time from radio awareness to conversion by category
- Proves radio’s long-term brand-building value
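The custom fields above map naturally onto a simple lead record, and days-to-conversion is just date arithmetic (the dates below are illustrative):

```python
from datetime import date

# Illustrative CRM record with the two custom source fields.
lead = {
    "first_touch_source": "Radio",      # built awareness
    "last_touch_source": "Referral",    # triggered the call
    "first_touch_date": date(2024, 3, 4),
    "conversion_date": date(2024, 5, 10),
}
days = (lead["conversion_date"] - lead["first_touch_date"]).days
print(f"{days} days from first touch to conversion")  # 67 days
```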
Q: My customers say they found me on Google after hearing my radio ad. Is that a Google lead or radio lead?
A: That’s a radio lead—Google was just the pathway, radio was the driver.
The attribution logic:
Scenario 1: Customer Journey WITHOUT Radio
- Never heard of your business
- Searches generic keyword (“Calgary HVAC repair”)
- Discovers you through Google Ads or organic results
- Attribution: Google (true discovery channel)
Scenario 2: Customer Journey WITH Radio
- Hears your radio ad
- Searches YOUR COMPANY NAME on Google
- Clicks your organic result or ad
- Attribution: Radio (drove the branded search)
The difference:
- Discovery search (generic): Google gets attribution
- Navigation search (branded): Radio gets attribution
How to track this properly:
Method 1: Google Search Console Analysis
- Compare branded vs. non-branded search volume
- Radio drives branded searches up 29% on average
- These branded searchers = radio-driven traffic
Method 2: Google Ads Account Analysis
- Check branded keyword performance during campaign
- Look for click volume increase on “[Your Company Name]” keywords
- This incremental branded search = radio-driven
Method 3: Train Staff to Ask a Clarifying Question
When a customer says “Google,” ask: “How did you know to search for our company specifically?”
- If answer is “I heard you on the radio,” that’s radio attribution
- If answer is “I searched for ‘Calgary dentist’ and you came up,” that’s Google attribution
Pro tip: Create two “Google” subcategories in your CRM:
- “Google – Branded Search” (radio-driven)
- “Google – Generic Search” (true Google discovery)
Q: Should I use call tracking software, or is asking “How did you hear about us?” enough?
A: Start with asking—it’s free and captures majority of direct response. Add call tracking when budget allows.
“How did you hear about us?” (Free)
Advantages:
- Zero cost
- Captures 25-40% of radio response
- Works for phone, web, and in-person inquiries
- Easy to implement immediately
Limitations:
- Requires consistent staff compliance
- Some customers don’t remember/report accurately
- Misses indirect response (branded search after radio)
Call Tracking Software ($30-150/month)
Advantages:
- 100% accurate for phone attribution
- Call recording enables quality control
- Time-stamp data shows daypart effectiveness
- Geographic data validates coverage
- Integration with CRM
Limitations:
- Monthly cost
- Setup time required
- Need unique phone number for radio (can’t use on website simultaneously unless using dynamic insertion)
Recommendation:
Phase 1 (First 1-3 months radio):
- Use “How did you hear about us?” consistently
- Track branded search lift in Search Console
- Monitor website traffic patterns in Analytics
- Combined cost: $0
Phase 2 (If radio validates performance):
- Add call tracking software (~$50-100/month)
- Implement campaign-specific URLs with better tracking
- Set up automated reporting dashboard
- Combined cost: $50-150/month
Don’t let lack of expensive tracking prevent you from starting radio. Basic methods provide sufficient validation—upgrade as performance justifies.
Q: How do I calculate ROI when I’m unsure which leads came from radio?
A: Use conservative attribution combined with total lift analysis:
Method 1: Conservative Direct Attribution (Minimum ROI)
Count only leads with clear radio attribution:
- Self-reported “radio” (25-40% typically)
- Campaign URL visits that converted
- Promo code redemptions
Calculate minimum ROI:
Example:
- Radio investment: $5,000
- Clearly attributed leads: 28
- Converted customers: 9
- Average revenue: $2,800
- Total revenue: $25,200
- ROI: 5.04x ($25,200 / $5,000)
This is your FLOOR—minimum provable ROI.
Method 2: Total Lift Analysis (Maximum ROI)
Compare TOTAL business during campaign vs. baseline:
Example:
- Baseline new customers (4 weeks prior): 12
- Campaign new customers (4 weeks during): 31
- Incremental customers: 19 (31 – 12 = 19)
- Average revenue: $2,800
- Incremental revenue: $53,200
- ROI: 10.64x ($53,200 / $5,000)
This is your CEILING—maximum possible radio contribution.
The Reality: Probably Somewhere in Between
Blended Attribution: Not ALL incremental customers came from radio (some baseline growth expected), but not ALL radio impact was captured in direct attribution (branded search, word-of-mouth amplification).
Reasonable estimate: Split the difference
Conservative ROI: 5.04x
Maximum ROI: 10.64x
Blended estimate: ~7.5-8x
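The floor/ceiling/blended arithmetic can be sketched directly from the example figures:

```python
def roi(revenue, spend):
    """Return on investment as a multiple of spend."""
    return revenue / spend

spend = 5_000
conservative = roi(9 * 2_800, spend)      # 9 clearly attributed customers x $2,800
maximum = roi((31 - 12) * 2_800, spend)   # 19 incremental customers over baseline
blended = (conservative + maximum) / 2    # split the difference
print(f"floor {conservative:.2f}x, ceiling {maximum:.2f}x, blended ~{blended:.1f}x")
```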
Strategic Decision:
- If conservative ROI alone justifies investment → proceed with confidence
- If blended ROI required to justify → monitor closely, optimize aggressively
- If even maximum ROI doesn’t justify → wrong channel or needs creative/offer improvement
Q: What if I advertise on multiple stations or multiple channels simultaneously? How do I isolate radio’s contribution?
A: Multi-channel attribution requires either isolation testing OR statistical modeling:
Option 1: Sequential Testing (Cleanest Attribution)
Approach: Run channels sequentially, not simultaneously:
- Months 1-2: Radio only
- Months 3-4: Pause radio, run Google Ads only
- Months 5-6: Radio only again
- Compare performance across periods
Advantages:
- Crystal clear attribution
- Definitively proves each channel’s contribution
- Easy to implement and understand
Disadvantages:
- Sacrifices combined channel synergy (radio + digital often better than either alone)
- Requires patience (longer timeline to test all channels)
- Some seasonality effects may confound results
When to use:
- Testing radio for first time
- Limited budget (can’t run multiple channels at once)
- Need definitive proof for stakeholders
Option 2: Geographic Testing (Split Markets)
Approach: If you serve multiple geographic markets:
- Calgary: Radio + Digital
- Edmonton: Digital only (control group)
- Compare performance between markets
Advantages:
- Tests channels simultaneously
- Proves radio’s incremental contribution
- Maintains business as usual in both markets
Disadvantages:
- Requires multi-market presence
- Markets must be similar enough for valid comparison
- Can’t test every channel combination
When to use:
- Multi-location businesses
- Regional/provincial operations
- Sufficient budget to run both markets
Option 3: Day-of-Week Testing
Approach:
- Radio runs Monday-Thursday
- No radio Friday-Sunday
- Compare lead volume Monday-Thursday vs. Friday-Sunday
- Account for any natural day-of-week patterns
Advantages:
- Tests within same market and time period
- Relatively quick results (4-6 weeks validates pattern)
- Maintains other channels consistently
Disadvantages:
- Requires careful statistical analysis
- Natural day-of-week variations must be accounted for
- Radio’s awareness effect may carry into “off” days
When to use:
- Single-market businesses
- Running multiple channels simultaneously
- Need quick validation
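A minimal sketch of the on-air vs. off-air comparison (illustrative daily lead counts; a real analysis should first adjust for your natural day-of-week pattern):

```python
# Daily lead counts over a 4-week test (illustrative, not real data).
on_days  = [9, 11, 10, 12, 8, 10, 11, 9, 12, 10, 11, 10, 9, 12, 11, 10]  # Mon-Thu (radio on)
off_days = [6, 5, 7, 6, 5, 6, 7, 5, 6, 6, 5, 7]                          # Fri-Sun (radio off)

avg_on = sum(on_days) / len(on_days)
avg_off = sum(off_days) / len(off_days)
print(f"on-air avg {avg_on:.1f}/day vs off-air {avg_off:.1f}/day "
      f"({(avg_on - avg_off) / avg_off * 100:.0f}% lift)")
```

Remember the carry-over caveat: awareness built Monday-Thursday can still generate Friday leads, so this method tends to understate radio's effect.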
Option 4: Statistical Modeling (Advanced)
Approach: Use marketing mix modeling (MMM) or multi-touch attribution (MTA) tools:
- Tracks all customer touchpoints
- Statistical algorithms assign credit to each channel
- Provides percentage contribution by source
Tools:
- Google Analytics 4 (data-driven attribution model)
- Wicked Reports
- Attribution.com
- HubSpot (with multi-touch attribution add-on)
Advantages:
- Sophisticated analysis
- Accounts for channel interactions
- Continuous optimization insights
Disadvantages:
- Expensive (tools $200-2,000+/month)
- Requires significant data volume (100+ conversions/month minimum)
- Complex setup and interpretation
When to use:
- Large businesses with complex marketing mix
- Sufficient budget for attribution tools
- High conversion volume
Practical Recommendation for Most Calgary Businesses:
Start with Sequential Testing:
- Month 1: Establish baseline (current marketing only)
- Months 2-3: Add radio (all else constant)
- Measure lift versus baseline
Then Optimize with Multi-Channel:
- Once radio validates, run radio + digital together
- Use conservative direct attribution for radio (self-reported, campaign URLs, promo codes)
- Accept that true ROI likely higher than direct attribution shows
Focus on Total Business Growth: Ultimately, the question isn’t “exactly what percentage came from radio vs. Google?”—it’s “did my total customer acquisition improve and at what cost?”
If radio + digital together drives better results at acceptable CAC, the precise attribution split matters less than the combined performance.
Call to Action
Need help building attribution infrastructure for your Calgary radio campaign?
Contact Jodi Morel at IDMD Brand Management
We’ll set up tracking systems, dashboards, and reporting to clearly measure your radio ROI.