Overview
If you’re evaluating a programmatic advertising agency, this guide helps you make a confident, defensible choice. It covers what agencies actually do, how to compare partners, what you should pay, and how to implement quickly without sacrificing governance.
You’ll get a pricing explainer with example math, a weighted selection scorecard, an RFP checklist, channel benchmarks, and a 30/60/90‑day onboarding plan.
A strong programmatic ad agency (also called a programmatic media agency or managed programmatic services partner) should deliver access to premium inventory and identity solutions, a transparent fee model, rigorous brand safety and fraud prevention, and an accountable measurement framework. Use this as your evaluation blueprint and a checklist for launch readiness.
What a programmatic advertising agency does (and how it differs from traditional media and in‑house)
A programmatic advertising agency plans, traffics, and optimizes digital media bought through demand‑side platforms (DSPs) across display, video/CTV, audio, DOOH, in‑app, in‑game, and retail media. Unlike traditional spot buys, programmatic is auction‑based and data‑driven. It enables precise audience, context, and supply path controls at the impression level.
Versus legacy media buying, a programmatic ad agency runs trading operations inside DSPs. It manages pixels and event taxonomies, integrates CRM/CDP data, sets brand safety and ad fraud safeguards, and builds test‑and‑learn roadmaps. The difference shows up in transparency (log‑level data options), speed of optimization, audience and creative personalization (including DCO), and advanced measurement such as incrementality testing or MMM calibration.
Compared with an in‑house desk, agencies bring battle‑tested playbooks across multiple verticals and up‑to‑date platform certifications. They offer flexible access to multiple DSPs, identity partners, and verification stacks. You trade some direct control for time‑to‑value, expanded tooling, and bench depth, which is especially useful for CTV, retail media, and international buying.
In‑house trading desk vs hiring an agency: costs, speed, control, and performance
The in‑house vs agency decision hinges on total cost, speed to launch, platform access, and your appetite for building a specialized media function. If you have persistent scale, a mature analytics stack, and the need for custom data integrations, in‑house can pay off. If you need faster learning cycles, specialized channels (CTV/retail media), or turnkey fraud controls, an agency is usually more efficient.
In‑house desks demand a full stack of talent—programmatic traders, analytics, ad ops, data engineers—and multiple platform contracts. Agencies amortize these investments across clients and can bring multi‑DSP optionality on day one. Many brands ultimately land on a hybrid: retain strategic ownership and first‑party data internally while using a programmatic media agency for activation and specialized channel execution.
When assessing performance potential, consider lift and risk. Agencies with proven test designs tend to ramp faster and avoid common pitfalls (e.g., over‑crediting retargeting, low‑quality supply). In‑house teams can optimize for long‑term data value but face a slower build and talent retention risk. An honest ROI comparison should model 12–24 months, including tools, headcount, and foregone learning.
- Choose in‑house when: you have sustained spend, executive cover to build a trading team, strong internal analytics, and specific data/IP considerations.
- Choose an agency when: you prioritize speed, need multi‑channel expertise (CTV/DOOH/retail media), seek multi‑DSP leverage, or require robust brand safety and compliance out of the box.
How to evaluate and choose a programmatic advertising agency
Evaluate agencies across transparent pricing, platform and verification stack access, identity/cookieless capability, measurement sophistication, supply path optimization, and fit with your vertical and markets. Ask for specific examples, not general claims. Require a documented operating model for test velocity, optimization guardrails, and QBR cadence.
Decision hygiene matters as much as criteria. Use a weighted scorecard and run a structured RFP. Bring the same creative, data, and targeting constraints to each finalist to enable apples‑to‑apples performance projections. Require disclosure on data ownership, log‑level access, cancellation terms, and the exact DSP seat and verification partners that will be used.
Weighted selection scorecard (criteria and example scoring)
Start with a clear weighting so subjectivity doesn’t creep in at the finish line. The following is a practical template you can adapt to your priorities:
- Platform and supply: multi‑DSP access and rationale; SPO policy; private marketplace relationships (20%)
- Brand safety and fraud: pre‑bid/post‑bid stack, MFA avoidance, allow/blocklists; TAG status (15%)
- Identity and cookieless: first‑party onboarding, UID2, clean rooms, Seller Defined Audiences, Privacy Sandbox plans (15%)
- Measurement: incrementality testing, MMM/MTA support, lift/attention studies; log‑level enablement (15%)
- Pricing transparency: fee model clarity; tech CPMs disclosed; minimums; pass‑through terms (15%)
- Vertical expertise and case proof: channel/geo fit; healthcare/finance compliance if relevant (10%)
- Team quality and process: certified traders; QA, pacing, and change management; QBRs (10%)
As an example, if Agency A excels in identity and brand safety but is single‑DSP, while Agency B offers multi‑DSP and stronger MMM/attention support, your scoring will reveal trade‑offs. Apply the weights above to each criterion on a 1–5 scale. The composite will keep the decision objective when presentations all sound stellar.
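The composite calculation is simple enough to keep in a shared sheet or short script so every stakeholder scores against the same weights. A minimal sketch, assuming the weights from the template above and invented 1–5 scores for two hypothetical finalists:

```python
# Weights mirror the selection-scorecard template above (must sum to 1.0).
WEIGHTS = {
    "platform_supply": 0.20,
    "brand_safety_fraud": 0.15,
    "identity_cookieless": 0.15,
    "measurement": 0.15,
    "pricing_transparency": 0.15,
    "vertical_expertise": 0.10,
    "team_process": 0.10,
}

def composite(scores: dict) -> float:
    """Weighted composite on a 1-5 scale."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical scores: Agency A strong on identity/safety but single-DSP;
# Agency B multi-DSP with stronger measurement support.
agency_a = {"platform_supply": 2, "brand_safety_fraud": 5, "identity_cookieless": 5,
            "measurement": 3, "pricing_transparency": 4, "vertical_expertise": 4,
            "team_process": 4}
agency_b = {"platform_supply": 5, "brand_safety_fraud": 4, "identity_cookieless": 3,
            "measurement": 5, "pricing_transparency": 4, "vertical_expertise": 3,
            "team_process": 4}

print(round(composite(agency_a), 2))  # Agency A composite
print(round(composite(agency_b), 2))  # Agency B composite
```

Because each finalist is scored against identical weights, the composite makes the trade-offs explicit rather than leaving them to post-presentation impressions.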
RFP checklist and questions to ask in pitches and QBRs
Your RFP should force specificity on pricing, tech stack, and operating rhythm. It also sets governance expectations you will revisit in QBRs.
- Exact DSP seat(s) proposed per channel and why; confirm whether you get log‑level data access and under what terms
- Full fee stack (percent of spend, tech CPMs, verification CPMs, data fees), minimums, and cancellation terms
- Identity plan: first‑party onboarding flow, consent handling, UID2, data clean room options, and Privacy Sandbox readiness
- Fraud/brand safety: pre‑bid filters, post‑bid verification partners, MFA policy, allow/blocklists, and SPO commitments
- Measurement: incrementality test design, attribution windows, lift/attention studies, MMM/MTA support, learning agenda
- Onboarding: 30/60/90‑day plan, pixels and event taxonomy, QA checklist, creative/DCO specs, and test cadence
- Reporting/QBRs: weekly diagnostics cadence, executive summary format, KPI ladders by funnel, roadmap governance
Insist on a sample media plan with your audience inputs, a redacted case study in your vertical, and the first three tests they would run in month one. In QBRs, hold them to the learning agenda and optimization roadmap they committed to in the pitch.
Team structure and certifications that signal quality
Certifications show that the people trading your budget are current on platform and policy changes. Ask for named leads, role coverage, and the backup bench.
- The Trade Desk Edge Academy certifications (including Trading or Advanced) for traders and strategists
- Google DV360 certifications for buyers; Google Analytics/GMP credentials for analytics integration
- IAB Digital Media Sales/Buying and IAB Tech Lab standards literacy (Seller Defined Audiences, ads.txt/app‑ads.txt)
- Verification partner training (e.g., DoubleVerify, IAS) for brand safety specialists
- Privacy and compliance familiarity for regulated verticals (HIPAA, GDPR, CCPA/CPRA) via internal training or counsel
Confirm who actually executes day to day. Clarify how QA and change management are enforced. Ask how coverage works across time zones and holidays.
Transparent pricing and fee models with example math
You should know exactly where each dollar goes across media, technology, verification, data, and agency fees. The most common structures are percent‑of‑spend, tech CPMs added to media, or hybrid combinations. Each can be fair if disclosed and right‑sized for your budget.
To understand performance impact, translate fees into effective CPM and then into expected CPA/ROAS. For example, if open‑web display media CPM is $3.00 and stacked tech/verification adds $1.50, your effective CPM is $4.50 before agency fees. If the agency charges 15% of spend, effective CPM rises to about $5.18. With a 0.3% CTR and a 4% landing‑page conversion rate, the modeled CPA is roughly $43. Small changes in stack costs or conversion lift can materially move CPA, so request sensitivity scenarios.
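The arithmetic above can be wrapped in two small helpers so finance and media teams model scenarios from the same formula. A sketch, assuming (as in the example) that the agency fee applies to the full stacked CPM:

```python
def effective_cpm(media_cpm: float, tech_cpm: float, agency_pct: float) -> float:
    """All-in cost per 1,000 impressions; agency fee applied to the stacked CPM."""
    return (media_cpm + tech_cpm) * (1 + agency_pct)

def modeled_cpa(eff_cpm: float, ctr: float, cvr: float) -> float:
    """Cost of 1,000 impressions divided by conversions per 1,000 impressions."""
    return eff_cpm / (1000 * ctr * cvr)

# The worked example: $3.00 media + $1.50 tech, 15% fee, 0.3% CTR, 4% CVR
ecpm = modeled = effective_cpm(3.00, 1.50, 0.15)  # about $5.18
cpa = modeled_cpa(ecpm, 0.003, 0.04)              # about $43
```

Rerunning with tech at $1.00 or conversion at 5% shows quickly how sensitive CPA is to small stack or landing-page changes, which is exactly the sensitivity scenario worth requesting.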
Percent of spend, tech CPMs, hybrid, and minimums
Each model comes with trade‑offs; the best one aligns incentives to your objective and spend tier.
- Percent of spend: Simple and scalable; common at 10–20% of media. Works well for mid‑to‑high spend where tech CPMs can be negotiated down. Watch for double‑charging if separate tech CPMs are also added.
- Tech CPMs (plus low management fee): Transparent pass‑through of DSP, verification, and data costs (e.g., $0.50–$2.00 CPM per component) with a lower management fee (5–10%). Great for finance teams that want line‑item clarity.
- Hybrid: A modest percent‑of‑spend plus capped tech CPMs to balance incentives and protect small campaigns from fee bloat.
- Minimums: Common monthly minimums range from $10k–$50k in media for open web; CTV/retail media often require higher. Ensure minimums ratchet down if activation is narrower than planned.
Worked example: targeting 10M impressions at a $3.00 media CPM and a $1.50 tech/verification CPM yields $30,000 in media and $15,000 in tech. With a 15% management fee on media ($4,500), the all‑in monthly cost is approximately $49,500. At $250k/month, negotiation may cut tech to ~$1.00 eCPM and management to ~12%, reducing the all‑in rate materially. Ask your agency to show fee‑stack scenarios at $50k, $250k, and $1M to see scalability.
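The fee-stack scenarios above are easy to reproduce. A sketch, assuming the management fee is charged on media only (as in the worked example) and using the illustrative negotiated-tier numbers:

```python
def all_in_cost(impressions: int, media_cpm: float, tech_cpm: float,
                mgmt_pct: float) -> float:
    """Monthly all-in cost: media + tech pass-through + management fee on media."""
    media = impressions / 1000 * media_cpm
    tech = impressions / 1000 * tech_cpm
    return media + tech + media * mgmt_pct

# 10M impressions at $3.00 media / $1.50 tech CPM with a 15% fee on media
base = all_in_cost(10_000_000, 3.00, 1.50, 0.15)        # about $49,500
# Illustrative negotiated tier: tech cut to ~$1.00, management to ~12%
negotiated = all_in_cost(10_000_000, 3.00, 1.00, 0.12)
```

Running the same function at 50k-, 250k-, and 1M-dollar spend tiers (with the agency's quoted rates at each tier) is a direct way to check the scalability claims in the proposal.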
Contracts, SLAs, transparency, reporting cadence, and QBR agenda
Your MSA/SOW should enshrine transparency, data rights, and service levels. Require explicit terms for data ownership (you own first‑party and any derived segments), log‑level access (format, frequency, and retention), cancellation windows, and change control for tech stack substitutions.
Set clear SLAs for trafficking turnaround, pixel/event updates, brand safety incident response, optimization pacing, and reporting cadence. QBRs should follow a consistent agenda: business context, performance vs targets, diagnostic deep dives (audience, supply, creative), tests completed/next, and a prioritized optimization roadmap with owners and dates.
A practical contract checklist keeps governance tight while staying agile.
- Data rights: advertiser owns all campaign data; log‑level access enabled for agreed use cases and safe transfer
- SPO and inventory policy: document default exchanges, MFA rules, allowlists/blocklists; process for exceptions
- Verification stack: pre‑bid and post‑bid partners specified; viewability standards; incident response timelines
- Identity and privacy: consent handling, DPA/DTIA if needed, and retention windows aligned to GDPR and CCPA
- Fees and minimums: full stack disclosure; tech changes require prior approval; cancellation and ramp‑down terms
- Reporting/QBRs: weekly diagnostics, monthly executive roll‑ups, and quarterly strategy reviews with learning agenda
DSP access and selection guidance (DV360 vs The Trade Desk vs Yahoo DSP and others)
The right DSP depends on your channels, identity needs, and geographies. Most mature agencies maintain access to multiple platforms to match use cases. DV360 is strong for the Google ecosystem and YouTube/CTV reach. The Trade Desk excels at open‑web scale and UID2 identity leadership. Yahoo DSP can deliver cost‑efficient video/CTV and native, while specialized DSPs cover retail media and DOOH.
Ask how the agency chooses a DSP per objective and what private marketplaces and data integrations they can unlock. Clarify whether you’ll operate under the agency seat or a client‑owned seat, what that means for data portability, and whether log‑level data will be available. For global buys, ensure local supply, language targeting, and compliance coverage.
- Choose DV360 when: you need YouTube and broad CTV access, strong brand suitability controls, and tight Google Marketing Platform integrations.
- Choose The Trade Desk when: you want superior open‑web reach, robust CTV supply, advanced identity (e.g., Unified ID 2.0), and curated‑marketplace options for retail media.
- Choose Yahoo DSP or others when: you want competitive pricing on video/CTV/native, unique supply relationships, or specific format strengths; or you’re activating niche channels like DOOH via specialized platforms.
Identity and cookieless strategy: first‑party data, UID2, clean rooms, and Privacy Sandbox
A future‑proof programmatic strategy hinges on first‑party data and interoperable IDs rather than third‑party cookies. Your agency should define how your CRM/CDP audiences are onboarded with consent. It should show how they’ll scale using interoperable IDs like Unified ID 2.0, how clean rooms are used for measurement or partner activation, and how contextual and publisher‑defined signals fill gaps.
Expect a multi‑pronged playbook: authenticated identity (UID2), publisher taxonomies (IAB Tech Lab Seller Defined Audiences), high‑quality contextual, and platform APIs such as the Google Privacy Sandbox for audience and attribution in Chrome. For sensitive data, clean rooms enable privacy‑safe joins with walled gardens or retailers without exposing raw PII.
- First‑party onboarding: map fields, confirm consent and retention, and set a refresh cadence; track match rates by channel.
- UID2 and other interoperable IDs: extend reach to authenticated inventory on the open web and CTV while maintaining consent controls.
- Data clean rooms: conduct overlap analysis, create measurement cohorts, and support retailer/walled‑garden collaborations.
- Contextual and publisher signals: use Seller Defined Audiences and high‑fidelity contextual to maintain scale without cookies.
- Sandbox readiness: plan tests for Protected Audiences, Topics, and Attribution Reporting where relevant.
Note: under GDPR and CCPA, personal data processing requires a lawful basis and consumer rights handling; ensure your agency’s processes reflect these obligations in consent, retention, and subject rights workflows.
Ad fraud prevention and brand safety framework (pre‑bid, post‑bid, MFA avoidance, SPO)
A robust program protects brand equity and budget using layered controls. Pre‑bid filters screen invalid traffic (IVT), malware, and unsuitable content. Post‑bid verification monitors what gets through. Supply path optimization (SPO) reduces hops and low‑quality resellers. Your agency should be conversant in TAG standards, maintain allowlists/blocklists, and publish an MFA (made‑for‑advertising) policy.
Verification partners like DoubleVerify provide pre‑bid segments and post‑bid reporting across viewability, fraud, and brand safety. Agencies should maintain TAG certifications where applicable and document response processes for unsafe placements. Commit to an SPO policy that prefers direct paths to premium publishers and prunes resellers with poor quality or transparency.
- Pre‑bid: IVT, brand safety/suitability tiers, geo and device validation, and MFA exclusion rules
- Post‑bid: continuous verification, unsafe page removal, make‑good management, and pattern analysis for future filters
- MFA and allowlists: enforce strict allowlists for CTV/PMPs; dynamic blocklists for risky domains/apps
- SPO: limit resellers, prefer direct paths, monitor auction duplication, and demand ads.txt/app‑ads.txt compliance
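Parts of the ads.txt compliance check above can be automated. A minimal sketch of a parser for the IAB spec's comma-separated records (ad-system domain, publisher account ID, DIRECT or RESELLER, optional certification authority ID); the sample records below are invented, and real files also carry variables like contact= and subdomain=, which this skips:

```python
def parse_ads_txt(text: str) -> list:
    """Minimal ads.txt parser: returns one dict per data record."""
    records = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments
        if not line or "=" in line:            # skip blanks and variable lines
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3:
            records.append({"domain": fields[0].lower(),
                            "account_id": fields[1],
                            "relationship": fields[2].upper()})
    return records

sample = """# ads.txt (invented example)
google.com, pub-1234567890, DIRECT, f08c47fec0942fa0
exampleexchange.com, 5678, RESELLER
contact=adops@example.com
"""
resellers = [r for r in parse_ads_txt(sample) if r["relationship"] == "RESELLER"]
```

Counting RESELLER entries per publisher across a plan's allowlist is a quick, repeatable proxy for how many indirect paths an SPO policy still tolerates.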
Measurement beyond reporting: incrementality, MMM vs MTA, brand lift, attention
Performance dashboards aren’t enough; you need a blueprint for causal learning. Your agency should propose incrementality testing—geo holdouts, audience split tests, or PSA/ghost ads—to separate real lift from retargeting or last‑click bias. Define attribution windows that reflect your sales cycle and use lift studies to read upper‑funnel media like CTV and audio.
MMM (marketing mix modeling) is the right tool for long‑cycle and offline sales. It provides channel‑level elasticities and budget scenarios. MTA (multi‑touch attribution) helps with digital path analysis but is constrained by signal loss and walled gardens. Treat it as a diagnostic, not a single source of truth. Attention metrics (e.g., time‑in‑view, interaction rates) can improve creative and supply decisions when aligned to outcomes.
Establish a quarterly learning agenda tied to business questions. Which audiences drive incremental sales? What frequency caps minimize waste? Which supply paths boost attention without inflating CPMs? How does CTV contribute to search and site conversion? Codify go/no‑go criteria and pre‑register tests so results drive decisions, not narratives.
Channel coverage and when to use each (CTV/OTT, DOOH, audio, retail media, in‑app, in‑game)
Pick channels for the job. CTV excels at broad reach and upper‑funnel lift. Retail media and high‑intent display/video drive mid‑to‑lower funnel. In‑app/in‑game can reach niche or younger cohorts at scale. Audio/podcasts balance cost‑effective reach with contextual strength. DOOH offers localized impact and footfall measurement.
Mind each channel’s pitfalls. CTV requires strict supply curation to avoid device spoofing and MFA. Retail media needs clean taxonomy and SKU‑level attribution. DOOH relies on accurate polygoning and visit measurement. Audio needs brand safety vetting for user‑generated content. In‑app and gaming require SDK and placement transparency and strong fraud filters. Ensure creative is native to the format and sequenced across the funnel.
KPI and benchmark ranges by channel and industry
Use ranges to plan and then localize with your own data. Actuals vary by market, seasonality, and targeting strictness.
- CTV/OTT: CPMs $20–$45 for premium; completion rates 90%+; expect brand lift over direct CPA. Strong for retail, auto, CPG; B2B uses account‑based targeting on curated supply.
- Display (open web): CPMs $2–$6; CTR 0.08–0.30%; viewability 60–80% with proper supply filters. MRC defines viewability as 50% of pixels in view for at least 1 second for display and 2 seconds for video (MRC viewability standards).
- Online video: CPMs $8–$25; completion rates 40–80% depending on skippable vs non‑skippable; expect assisted conversions and search lift.
- Audio/podcasts: CPMs $18–$35; completion 85–98%; strong for awareness and mid‑funnel recall in finance, healthcare, and SaaS thought leadership.
- DOOH: CPMs $5–$20 (effective); best judged by reach, impressions, and modeled footfall lift; multi‑location retail and QSRs benefit from proximity tactics.
- Retail media networks: CPCs $0.50–$3.00 for sponsored products; ROAS varies by category; upper‑funnel display/video CPMs $8–$25 with strong closed‑loop sales attribution.
- In‑app/in‑game: CPMs $3–$12; viewability and fraud controls are essential; performance varies widely by exchange and placement.
These are planning anchors. Align KPIs to funnel stage, set guardrails for viewability and brand safety, and use incrementality tests to confirm contribution. For viewability, use MRC thresholds as your baseline and raise targets where inventory allows.
Creative strategy and DCO: feed‑based, sequential messaging, testing cadence
Creative is your biggest performance lever once supply is clean. Map messages to funnel stages: problem/brand value at the top, proof and differentiation mid‑funnel, and urgency or offer near conversion.
Use sequential storytelling to move users through the journey and vary formats by channel (sight, sound, and motion in CTV; utility and clarity in display; narrative hooks in audio). Dynamic creative optimization (DCO) amplifies relevance by swapping images, copy, and CTAs based on audience, context, or product feed.
Set a disciplined test cadence. Test one variable at a time, use minimum sample sizes, and pre‑register hypotheses. Retire creative quickly when attention drops and rotate winners into new narratives.
- DCO triggers to consider: geo and weather, product/category interest, cart/CRM status, publisher/contextual category, and funnel stage
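"Minimum sample sizes" can be made concrete with a standard two-proportion power calculation. A sketch using the normal approximation; the 4.0% baseline and 0.5-point lift are illustrative, not figures from this guide:

```python
from math import ceil
from statistics import NormalDist

def min_sample_per_arm(p_base: float, lift: float,
                       alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate per-arm sample size for a two-proportion test.

    p_base: control conversion rate; lift: absolute improvement to detect.
    """
    p_test = p_base + lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = NormalDist().inv_cdf(power)            # desired power
    variance = p_base * (1 - p_base) + p_test * (1 - p_test)
    return ceil((z_alpha + z_beta) ** 2 * variance / lift ** 2)

# e.g., detect a lift from 4.0% to 4.5% landing-page conversion
n = min_sample_per_arm(0.04, 0.005)
```

The key practical point: halving the detectable lift roughly quadruples the required sample, which is why testing one variable at a time and pre-registering hypotheses matters.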
Onboarding and implementation: 30/60/90‑day timeline, pixels, event taxonomy, QA
A tight launch plan balances speed with quality. Lock creative, data, and measurement foundations first, then scale channels with governance intact. Define go‑live criteria early: tagging verified, brand safety rules enforced, budget pacing checks in place, and the first three tests queued.
- Days 0–30: Contracting and access; DSP seat confirmation; pixel and CAPI deployment; event taxonomy finalized (pageview, product view, add‑to‑cart, lead, purchase with values); consent strings validated; baseline reporting; creative/DCO specs; brand safety/SPO policies; QA of tags, deduping, and offline conversions if needed
- Days 31–60: Soft launch in 1–2 priority channels; verification tuning; initial creative and audience A/B tests; bid and frequency calibration; weekly diagnostics; establish executive dashboard; document early learnings; prepare CTV/retail media/DOOH expansions
- Days 61–90: Scale across approved channels; implement incrementality test (geo or cohort split); introduce sequential creative; QBR with roadmap updates; decide on log‑level export cadence; audit identity performance (match rates, UID2 reach), and refine SPO
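The event taxonomy and tag QA in days 0–30 lend themselves to a simple validation harness. A sketch; the event names and required fields below are assumptions modeled on the events listed in the plan, not a standard schema:

```python
# Illustrative event taxonomy for tagging QA (names/fields are assumptions).
EVENT_TAXONOMY = {
    "pageview":     {"required": ["page_url"]},
    "product_view": {"required": ["product_id"]},
    "add_to_cart":  {"required": ["product_id", "value", "currency"]},
    "lead":         {"required": ["lead_id"]},
    "purchase":     {"required": ["order_id", "value", "currency"]},
}

def validate_event(name: str, payload: dict):
    """QA check: event exists in the taxonomy and carries its required fields."""
    spec = EVENT_TAXONOMY.get(name)
    if spec is None:
        return False, f"unknown event: {name}"
    missing = [f for f in spec["required"] if f not in payload]
    return (not missing), (f"missing fields: {missing}" if missing else "ok")

ok, msg = validate_event("purchase",
                         {"order_id": "A1", "value": 59.0, "currency": "USD"})
```

Running every fired event through a check like this before go-live catches the deduping and value-passing gaps that otherwise surface weeks later as broken attribution.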
By day 90, you should have a stable operating rhythm, a live learning agenda, and initial performance proofs by channel and audience. Resist the urge to launch everything on day one. Stagger to preserve clean readouts.
Budgeting and sample media plans by funnel stage and geo
Budget to your objectives and market size, not to arbitrary channel splits. A common pattern anchors awareness with high‑quality CTV or online video in top markets. Support with contextual and high‑viewability display. Convert demand with retargeting and retail media where applicable.
As a rule of thumb, CTV/DOOH need threshold spends to achieve reach and frequency. Underfunded CTV risks high CPMs with minimal impact.
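A quick sanity check for whether CTV clears its reach threshold: required budget is target households times target frequency times CPM per thousand. A sketch with illustrative inputs (the household count, frequency, and CPM below are assumptions, though the CPM sits in the premium-CTV range cited earlier):

```python
def required_budget(households: int, frequency: float, cpm: float) -> float:
    """Budget needed to reach `households` at `frequency` impressions each."""
    impressions = households * frequency
    return impressions / 1000 * cpm

# e.g., 1M households at 3x monthly frequency on a $30 CPM premium CTV buy
budget = required_budget(1_000_000, 3, 30.0)  # $90,000/month
```

If the planned CTV line falls well short of this figure, the spend is usually better placed in online video until reach thresholds can be met.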
Sample full‑funnel allocation for a national brand at $250k/month: 35% CTV/OTT for reach; 25% online video; 20% high‑quality display/contextual; 10% retail media or commerce media; 10% retargeting and CRM reactivation. For a regional test across 5 DMAs at $75k/month, lean heavier into online video and high‑impact display (skip CTV if it can’t clear reach thresholds), maintain a small always‑on retargeting line, and invest in DOOH only if you can polygon key trade areas with footfall measurement.
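The sample split translates to dollar lines as follows. A sketch; the channel names and percentages mirror the $250k/month national example above:

```python
def allocate(budget: float, splits: dict) -> dict:
    """Turn percentage splits into dollar lines; splits must sum to 1.0."""
    assert abs(sum(splits.values()) - 1.0) < 1e-9
    return {channel: round(budget * pct, 2) for channel, pct in splits.items()}

national = allocate(250_000, {
    "CTV/OTT": 0.35,
    "Online video": 0.25,
    "Display/contextual": 0.20,
    "Retail media": 0.10,
    "Retargeting/CRM": 0.10,
})
```

Keeping the split in one place makes quarterly rebalancing auditable: the QBR roadmap changes the percentages, and the dollar lines follow.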
Match KPIs to stage: awareness (reach, on‑target %, viewability/attention, brand lift), consideration (site engagement, aided recall, view‑through), and conversion (CPA/ROAS, incrementality). Rebalance quarterly using MMM or contribution analyses and codify changes in your QBR roadmap.
