How to Vet and Work with an SEO Agency: A Technical Audit & On-Page Optimization Checklist

You’ve hired an SEO agency—or you’re about to. The pitch deck was slick, the case studies looked impressive, and the account manager promised “growth.” But six months in, organic traffic hasn’t budged, or worse, you got a manual action notice from Google. The problem isn’t SEO itself; it’s that many agencies treat optimization as a black box of keyword stuffing and link buying. A serious SEO partnership—one that actually moves metrics—starts with a rigorous technical audit and a transparent, risk-aware on-page strategy. This checklist walks you through exactly what to demand from your agency, what red flags to watch for, and how to brief campaigns that survive algorithm updates.

What a Real Technical SEO Audit Covers (and What to Ignore)

A technical SEO audit is not a one-page PDF that says “fix your title tags.” It’s a deep, crawl-level investigation of how search engines discover, render, and index your site. The agency should start by analyzing your crawl budget—how many pages Googlebot visits per day and whether it’s wasting those visits on thin or duplicate content. If your site has 50,000 URLs but only 2,000 get indexed, the problem isn’t content; it’s crawl efficiency.

The audit must address three core layers:

| Layer | What It Checks | Common Failure Point |
| --- | --- | --- |
| Crawlability | robots.txt directives, XML sitemap coverage, internal link depth | Blocking critical pages via robots.txt or missing sitemaps |
| Indexability | Canonical tags, noindex tags, redirect chains, duplicate content | Multiple URLs for the same product page without a canonical |
| Performance | Core Web Vitals (LCP, CLS, INP), server response time, mobile responsiveness | Slow Largest Contentful Paint due to unoptimized images or render-blocking scripts |

Avoid agencies that skip crawl budget analysis or claim they can “fix everything in two weeks.” A thorough audit of a mid-size site takes weeks, not days, and requires access to Google Search Console, server logs (where available), and a crawl tool like Screaming Frog or Sitebulb. If they hand you a report without showing you the crawl log data, ask why.
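Crawlability problems in the first layer can be spot-checked even before a full crawl. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt rules and URLs are hypothetical examples, not your site's actual configuration:

```python
# Sketch: verify that high-value URLs aren't blocked by robots.txt.
# The rules and URLs below are hypothetical examples.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

key_pages = [
    "https://example.com/products/widget",  # should be crawlable
    "https://example.com/cart/checkout",    # intentionally blocked
]
for url in key_pages:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url} -> {'OK' if allowed else 'BLOCKED'}")
```

Running this against your real robots.txt and a list of money pages catches the most common crawlability failure in the table: blocking critical pages by accident.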

The Crawl Budget Trap: Why Your Sitemap Isn’t Enough

Most site owners assume that submitting an XML sitemap guarantees indexing. It doesn’t. A sitemap is a suggestion, not a command. Googlebot has a limited crawl budget per site, set by what your server can handle (crawl capacity) and how much Google wants to crawl it (crawl demand, driven by popularity and freshness). If your sitemap includes 10,000 URLs but only 200 are high-quality, the bot may spend its budget on the wrong pages.

Here’s the practical workflow the agency should follow:

  1. Audit your current crawl allocation in Google Search Console under “Crawl stats.” Look at the average pages crawled per day versus total pages on your site.
  2. Identify crawl waste: pages with noindex tags, thin affiliate content, paginated archives, or session-based URLs.
  3. Prune or consolidate those URLs via robots.txt disallow or by removing them from the sitemap.
  4. Prioritize high-value pages (product pages, cornerstone articles, service landing pages) in the sitemap. Note that Google ignores the `<priority>` and `<changefreq>` hints; an accurate `<lastmod>` carries far more weight.
  5. Monitor index coverage weekly for the first month to see if the ratio improves.
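The first two steps above reduce to simple arithmetic. A sketch using hypothetical Search Console numbers (matching the 50,000-URL example earlier):

```python
# Rough crawl-efficiency check from Google Search Console figures.
# All numbers are hypothetical; substitute your own "Crawl stats" data.

def crawl_coverage(pages_crawled_per_day: int,
                   total_urls: int,
                   indexed_urls: int) -> dict:
    """Days to fully recrawl the site, and the indexed share of URLs."""
    return {
        "full_recrawl_days": round(total_urls / pages_crawled_per_day, 1),
        "index_coverage": round(indexed_urls / total_urls, 3),
    }

stats = crawl_coverage(pages_crawled_per_day=800,
                       total_urls=50_000,
                       indexed_urls=2_000)
print(stats)  # {'full_recrawl_days': 62.5, 'index_coverage': 0.04}
```

A 62-day recrawl cycle with 4% index coverage is the kind of ratio that tells you pruning, not more content, is the priority.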

A common mistake agencies make is adding every blog post to the sitemap without checking for duplicate content. If you have two URLs that serve the same article (e.g., with and without a trailing slash), you may be wasting crawl budget. The canonical tag should point to the preferred version, and the other should redirect via 301.
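The trailing-slash consolidation described above can be sketched as a pair of small helpers. The "no trailing slash" policy here is an assumption for illustration; the point is to pick one policy site-wide and 301 the other variant to it:

```python
# Sketch: normalize trailing-slash duplicates to one canonical form.
# The no-trailing-slash preference is an assumed policy, not a rule.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Return the preferred (no-trailing-slash) version of a URL."""
    parts = urlsplit(url)
    path = parts.path
    if len(path) > 1 and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit((parts.scheme, parts.netloc, path,
                       parts.query, parts.fragment))

def needs_redirect(url: str) -> bool:
    """True when the URL should 301 to its canonical twin."""
    return canonical_url(url) != url

print(canonical_url("https://example.com/blog/post/"))
# https://example.com/blog/post
```

Run the same logic over a crawl export and every URL where `needs_redirect` is true is a candidate 301, with the canonical tag pointing at the normalized form.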

On-Page Optimization: Beyond Keyword Density

On-page optimization has evolved from stuffing “best running shoes” into a paragraph five times to matching search intent and structuring content for featured snippets. When you brief an agency, demand a keyword research process that includes intent mapping—not just a list of high-volume terms.

For each target keyword, the agency should answer:

  • Is the intent informational (users want a guide), navigational (they want a specific site), commercial (they’re comparing options), or transactional (they’re ready to buy)?
  • What content format dominates the current top 10 results? (Listicles, product pages, videos, long-form guides?)
  • What is the average word count, heading structure, and internal link density of those top pages?
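An intent map is ultimately just a keyword-to-category table the agency should hand over before writing starts. A minimal sketch with made-up keywords, grouped by the four intents listed above:

```python
# Hypothetical slice of an intent map; keywords and labels are
# illustrative only, following the four categories above.
intent_map = {
    "how to clean trail running shoes": "informational",
    "asics official site": "navigational",
    "best trail running shoes": "commercial",
    "buy gel-trabuco size 10": "transactional",
}

# Group target keywords by intent before any content brief is written.
by_intent: dict[str, list[str]] = {}
for keyword, intent in intent_map.items():
    by_intent.setdefault(intent, []).append(keyword)

print(sorted(by_intent))
# ['commercial', 'informational', 'navigational', 'transactional']
```

If every keyword in the agency's list lands in one bucket, the "research" was probably just a volume export.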

Only after this analysis should the agency create a content brief. The actual on-page elements to optimize include:

| Element | Best Practice | Red Flag |
| --- | --- | --- |
| Title tag | Unique per page, includes primary keyword near front, under 60 characters | Same title on multiple pages or keyword-stuffed |
| Meta description | Compelling summary with keyword, under 160 characters, includes call-to-action | Missing, duplicated, or auto-generated |
| H1 heading | One per page, matches the page’s main topic, not identical to title | Multiple H1s or irrelevant headings |
| Image alt text | Descriptive, includes keyword where natural, under 125 characters | Keyword-stuffed or left blank |
| Internal links | 3–5 relevant links per 500 words, using descriptive anchor text | Links to irrelevant pages or exact-match anchors on every link |
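The length limits above can be wrapped in a quick validator for auditing pages at scale. The 60- and 160-character thresholds are the common rules of thumb cited here, not hard limits Google enforces:

```python
# Sketch validator for the on-page length limits discussed above.
# Thresholds are conventional guidance, not hard Google limits.

def check_on_page(title: str, meta_description: str) -> list[str]:
    """Return a list of on-page issues (empty list means no issues)."""
    issues = []
    if not title:
        issues.append("missing title tag")
    elif len(title) > 60:
        issues.append(f"title too long ({len(title)} chars)")
    if not meta_description:
        issues.append("missing meta description")
    elif len(meta_description) > 160:
        issues.append(f"meta description too long ({len(meta_description)} chars)")
    return issues

print(check_on_page(
    "Best Trail Running Shoes for 2024 | Example Store",
    "Compare our top-rated trail shoes, with sizing tips and free returns.",
))
# []
```

Feed it a crawl export's title and description columns and you get a per-page issue list instead of eyeballing a spreadsheet.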

Agencies that promise first-page ranking by repeating keywords are either lying or using black-hat tactics. Real on-page optimization takes iterative testing: you write, publish, monitor CTR and dwell time, then adjust headings and meta descriptions based on performance.

Link Building: The Risk-Aware Brief

Link building remains the most dangerous part of SEO—and the most misunderstood. A good agency will never sell you “1,000 backlinks for $500” or promise “instant DA boost.” Instead, they’ll start with a backlink profile audit to see what you already have. Tools like Ahrefs or Majestic can report third-party authority scores (Ahrefs’ Domain Rating, Majestic’s Trust Flow), but more importantly, they show the ratio of toxic to healthy links.

Your briefing to the agency should include:

  • A clear definition of target sites: industry-specific blogs, .edu or .gov resources (if relevant), local business directories, and reputable news outlets.
  • A prohibition on: private blog networks (PBNs), automated directory submissions, link exchanges, and paid links disguised as “sponsored content” without proper `rel="sponsored"` tags.
  • A measurement framework: track new referring domains per month, domain rating of acquired links, and the ratio of dofollow to nofollow links. Ignore total backlink count—it’s vanity.
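The measurement framework above boils down to a handful of computed metrics. A sketch over a made-up sample of acquired links (the URLs and flags are illustrative):

```python
# Sketch of the link measurement framework: referring domains and
# dofollow share from a month's acquired links. Sample data is made up.
from urllib.parse import urlsplit

links = [
    {"url": "https://blog-a.example/post", "dofollow": True},
    {"url": "https://blog-a.example/other", "dofollow": True},
    {"url": "https://news-b.example/story", "dofollow": False},
    {"url": "https://dir-c.example/listing", "dofollow": True},
]

# Unique referring domains, not raw link count (which is vanity).
referring_domains = {urlsplit(link["url"]).netloc for link in links}
dofollow_share = sum(link["dofollow"] for link in links) / len(links)

print(len(referring_domains), f"{dofollow_share:.0%}")  # 3 75%
```

Note that four links collapse to three referring domains here; that distinction is exactly why total backlink count is the wrong headline metric.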

The agency should provide a monthly report showing which sites they’ve reached out to, the response rate, and the links successfully placed. If they can’t show you the outreach emails (redacted for privacy), they’re likely buying links from a marketplace.

What can go wrong? If the agency buys low-quality links from a PBN, Google may algorithmically detect the pattern and devalue the links, or its webspam team may issue a manual action. Recovering from a link penalty can take months and may require disavowing domains. Always ask for a link quality guarantee in your contract—if a link gets flagged, the agency should remove it or replace it at no cost.

Core Web Vitals: The Performance Layer Most Agencies Ignore

Core Web Vitals—specifically Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP)—are now ranking signals. Yet many SEO agencies treat them as a developer problem. They’re not. Poor LCP (over 2.5 seconds) or high CLS (over 0.1) directly hurt user experience and rankings, especially on mobile.

When evaluating an agency’s performance approach, ask:

  • Do they run Lighthouse tests on your top 10 landing pages monthly?
  • Can they interpret the lab data (from Lighthouse) versus field data (from Chrome User Experience Report)?
  • Do they provide actionable fixes, like “compress hero images to under 100 KB” or “lazy-load below-the-fold videos”?
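The thresholds involved are public: Google's "good" bands are LCP ≤ 2.5 s, CLS ≤ 0.1, and INP ≤ 200 ms. A minimal classifier sketch (the page metrics are hypothetical field data):

```python
# Sketch: classify field metrics against Google's published "good"
# Core Web Vitals thresholds. The page values below are hypothetical.

THRESHOLDS = {"lcp_s": 2.5, "cls": 0.1, "inp_ms": 200}

def cwv_pass(metrics: dict) -> dict:
    """Map each metric to True (within 'good') or False."""
    return {name: metrics[name] <= limit for name, limit in THRESHOLDS.items()}

page = {"lcp_s": 3.1, "cls": 0.05, "inp_ms": 180}
print(cwv_pass(page))  # {'lcp_s': False, 'cls': True, 'inp_ms': True}
```

Run this over CrUX field data per page group and the agency's "pass rate" claims become auditable rather than anecdotal.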

A good agency will create a performance improvement roadmap that prioritizes the low-effort, high-impact fixes first. For example, removing render-blocking JavaScript can improve LCP noticeably with a single code change. They should also monitor your INP score, which measures how quickly your page responds to user interactions like clicks and taps. A high INP (over 200 ms) can frustrate users and may hurt conversion rates.

If the agency’s only performance recommendation is “upgrade your hosting,” they’re not doing real technical SEO. Server response time matters, but so do image optimization, font loading, and script deferral.

How to Brief a Campaign That Survives Algorithm Updates

The best SEO campaigns are built for longevity, not quick wins. When you brief your agency, frame the goals around sustainable growth rather than specific ranking positions. For example:

  • “Increase organic traffic from high-intent keywords by 25% over six months” (instead of “rank #1 for ‘best SEO tool’”).
  • “Reduce crawl waste by 40% and improve index coverage to 90% of valuable pages.”
  • “Achieve a Core Web Vitals pass rate of 80% on mobile traffic within three months.”

Include a risk clause: if the agency uses tactics that result in a manual action or ranking drop from an algorithm update (like a Helpful Content Update), they must provide a remediation plan within two weeks at no extra cost. This protects you from agencies that chase algorithmic loopholes.

Finally, demand transparency in reporting. The monthly report should show:

  • Crawl budget usage (pages crawled per day vs. total indexable pages)
  • Core Web Vitals pass/fail rates per page group
  • Backlink acquisition by domain rating and trust flow
  • On-page optimization completion rate (how many pages have been updated with proper title tags, meta descriptions, etc.)
  • Organic traffic by landing page and keyword intent

If the report only shows “traffic up 10%” without context, push for the underlying data. Real SEO growth is measurable, explainable, and repeatable—not a black box of promises.

Final Checklist: What to Demand from Your Agency

Use this as your go-to when evaluating proposals or reviewing ongoing work:

  • Technical audit covers crawl budget, robots.txt, XML sitemap, canonical tags, duplicate content, and Core Web Vitals.
  • On-page strategy includes intent mapping, keyword research with volume and difficulty, and a content brief template.
  • Link building starts with a backlink profile audit, prohibits black-hat tactics, and provides outreach transparency.
  • Performance monitoring includes monthly Lighthouse tests and field data from Chrome UX Report.
  • Reporting includes crawl stats, Core Web Vitals pass rates, backlink quality metrics, and organic traffic by intent.
  • Risk management includes a clause for manual action remediation and algorithm update response.

A serious SEO agency treats your site like a technical system, not a marketing experiment. They audit first, optimize second, and measure third. If they skip the audit or gloss over crawl budget, keep looking. Your site deserves partners who understand that SEO is engineering, not magic.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
