The Technical SEO Agency Checklist: How to Vet, Brief, and Audit for Real Results

You are about to hire an SEO agency—or perhaps you are the agency being hired. Either way, the single most common mistake in the industry is treating SEO as a black box: "We'll do SEO, trust us." That approach fails because SEO is not a monolithic service. It is a layered discipline where technical foundations, on-page signals, and off-page authority must work in concert. Without a structured checklist to evaluate and brief an agency, you risk paying for vanity metrics, black-hat shortcuts, or a one-size-fits-all strategy that ignores your site's specific crawl budget, Core Web Vitals, and content gaps.

This guide provides a step-by-step framework for defining, auditing, and managing SEO agency work. It covers what to demand in a technical SEO audit, how to brief a keyword research and content strategy campaign, and how to evaluate link building without falling for dangerous promises. The tone is direct and skeptical—because the SEO industry is full of sales pitches dressed as expertise.

Step 1: Define the Scope of Technical SEO Audits and Site Health

Before any optimization begins, your agency must prove they understand your site's technical foundation. A proper technical SEO audit is not a list of "fix title tags" or "add alt text." It is a systematic examination of how search engines discover, crawl, index, and render your pages. The audit should cover at least these five areas:

  1. Crawl budget and crawlability – Does your site waste Googlebot's limited crawl budget on infinite parameter URLs, thin pages, or low-value archives? The agency should analyze server logs (not just crawl reports) to identify crawl inefficiencies. If your site has 50,000 URLs but only 10,000 are indexed, crawl budget allocation is a likely culprit, though thin or duplicative content can also suppress indexing.
  2. XML sitemap and robots.txt – Your sitemap.xml must list only canonical, indexable pages. The robots.txt must not accidentally block CSS, JS, or critical pages. A common error: leaving `Disallow: /` in robots.txt after a site migration, which blocks all crawling and can push the entire site out of Google's index over the following weeks.
  3. Core Web Vitals (LCP, CLS, and INP, which replaced FID in March 2024) – These are not "nice to have." Google's page experience update made them ranking signals for all organic results (mobile in 2021, desktop in 2022). The audit must measure real-user field data from the Chrome User Experience Report (CrUX), not just lab data from Lighthouse. Poor LCP (Largest Contentful Paint) often stems from oversized hero images or render-blocking JavaScript; poor CLS (Cumulative Layout Shift) usually comes from ads or images without explicit dimensions.
  4. Canonical tags and duplicate content – Every page that should be indexed must have a self-referencing canonical tag. E-commerce sites with faceted navigation (e.g., `?color=red&size=m`) are notorious for generating thousands of near-identical URLs. The audit should identify whether the agency plans to manage duplicates with canonical tags, noindex directives, or robots.txt rules for parameter URLs (Google retired Search Console's URL Parameters tool in 2022).
  5. Redirect chains and broken links – A single 301 redirect is fine. A multi-hop chain (A → B → C → D → E) wastes crawl budget and dilutes link equity. The audit must map all redirect chains longer than three hops and recommend flattening each to a single redirect.
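A spot check on redirect chains does not require the agency's tooling. Below is a minimal Python sketch using the `requests` library; the URLs and hop threshold are placeholders, not a recommendation:

```python
# Minimal sketch: map redirect chains for a handful of URLs.
# The URL list and MAX_HOPS threshold are illustrative placeholders.
import requests

URLS = ["https://example.com/old-page", "https://example.com/legacy/"]
MAX_HOPS = 3  # the audit threshold suggested in point 5 above

for url in URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = [r.url for r in resp.history]  # one entry per 3xx response
    if len(chain) > MAX_HOPS:
        print(f"{url}: {len(chain)} hops, flatten to a single 301")
        print("  chain:", " -> ".join(chain + [resp.url]))
```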
Table 1: Minimum Deliverables in a Technical SEO Audit

| Audit Component | What to Expect | Red Flag |
|---|---|---|
| Crawl budget analysis | Server log analysis showing crawl frequency per URL pattern | Only a "crawl report" from Screaming Frog without log data |
| Core Web Vitals | CrUX data broken down by URL group (e.g., product pages vs. blog) | Only Lighthouse scores on a single test URL |
| XML sitemap review | List of included URLs vs. indexed URLs with discrepancy explanation | No mention of sitemap size limits (50,000 URLs, 50 MB) |
| robots.txt check | Screenshot of the file with annotations on blocked resources | No check for accidental `Disallow: /` |
| Canonicalization | Report of pages with missing, conflicting, or cross-domain canonicals | No mention of hreflang tags if the site is multilingual |
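As a reference point for the robots.txt row above, a safe file tends to look something like the illustrative sketch below. Every path here is a placeholder for your own site's structure; only the warning in the final comment is universal.

```
# Illustrative robots.txt -- all paths are placeholders
User-agent: *
Disallow: /cart/            # session-specific, low-value pages
Disallow: /*?sessionid=     # parameter URLs that waste crawl budget
Allow: /assets/             # never block the CSS/JS needed for rendering
Sitemap: https://example.com/sitemap.xml

# Red flag from the table: a bare "Disallow: /" blocks the ENTIRE site.
```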

Step 2: Brief the On-Page Optimization and Keyword Research Campaign

Once the technical audit is complete, the agency should move to on-page optimization. This is where most agencies fail because they treat it as a mechanical task: "Insert keyword X into H1, body text, and meta description." That approach ignores search intent and content quality. Your brief must demand intent mapping.

How to brief keyword research:

  • Start with a seed list of 20–30 terms that represent your core products or services. For each term, the agency should classify it by intent: informational (searcher wants to learn), navigational (wants to find a specific site), commercial (wants to compare options), or transactional (wants to buy). A law firm ranking for "personal injury lawyer" (transactional) needs a different page than one ranking for "how to file a personal injury claim" (informational). A toy classification sketch follows this list.
  • Require the agency to produce a content gap analysis. This compares your current ranking pages to the top 10 results for your target keywords. If every top result has a video, a FAQ section, or a downloadable PDF, your page needs the same. If your page is a thin 300-word paragraph while competitors have 2,000-word guides, you cannot win without a content rewrite.
  • For e-commerce clients, demand intent mapping at the category and product level. A category page for "men's running shoes" should not target the same keywords as a blog post titled "best running shoes for marathons." The category page must prioritize transactional and commercial intent; the blog post targets informational intent.
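To make the intent bucketing from the first bullet concrete, here is a toy Python sketch. The trigger words and seed terms are illustrative assumptions; a real brief should verify intent against what actually ranks, since the SERP reveals the intent Google has inferred.

```python
# Toy intent bucketing for a seed keyword list. Trigger words and the
# seed terms are illustrative; real classification needs SERP review.
SEEDS = [
    "personal injury lawyer",
    "how to file a personal injury claim",
    "best personal injury firms",
    "acme law login",  # hypothetical brand/navigational query
]

TRIGGERS = {
    "informational": ("how to", "what is", "guide"),
    "commercial": ("best", "vs", "review"),
    "navigational": ("login", "contact"),
}

def classify(keyword: str) -> str:
    for intent, words in TRIGGERS.items():
        if any(w in keyword for w in words):
            return intent
    return "transactional"  # default for bare product/service terms

for kw in SEEDS:
    print(f"{kw!r}: {classify(kw)}")
```

A keyword that matches no trigger falls through to "transactional" here purely for illustration; in practice, ambiguous terms deserve manual review.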
On-page optimization checklist:
  • Title tags: unique, under 60 characters, include primary keyword near the front, avoid keyword stuffing.
  • Meta descriptions: persuasive, under 160 characters, include call-to-action.
  • H1 tags: one per page, matches the title tag in theme but not identical.
  • Header hierarchy: H2s for main sections, H3s for subsections. No skipping levels.
  • Image optimization: descriptive file names, alt text with target keyword where natural, WebP format, explicit width/height attributes to prevent CLS.
  • Internal linking: at least 3–5 contextual links per page pointing to relevant pillar content. No "click here" anchor text.
  • Structured data: JSON-LD for BreadcrumbList, Product, FAQ, or Article schema as appropriate.
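As a concrete reference for that last item, a minimal FAQ block in JSON-LD looks like the sketch below. The question and answer are placeholders; validate real markup with Google's Rich Results Test.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you offer free returns?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, within 30 days of purchase."
    }
  }]
}
</script>
```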
Risk alert: If an agency promises to "optimize all 500 product pages in one week," they are likely using automation tools that generate thin, duplicate content. Google's helpful content system devalues content created primarily for search engines rather than for people. Demand manual or semi-automated work with human review.

Step 3: Evaluate the Link Building Strategy—and the Risks

Link building is the most dangerous part of SEO because it is the hardest to do well and the easiest to do badly. Black-hat tactics—private blog networks (PBNs), paid links, link exchanges, automated outreach—can work in the short term but often lead to manual penalties or algorithmic devaluations. Your agency must present a link building strategy that is transparent, scalable, and risk-aware.

What to demand from a link building brief:

  • Backlink profile audit first. Before building new links, the agency must analyze your existing backlink profile using tools like Ahrefs, Majestic, or Moz. They should identify toxic links (low Trust Flow, high spam score, irrelevant domains) and recommend disavowing them if there is a manual action or a clear pattern of spam.
  • Link acquisition methods. Legitimate methods include: guest posting on relevant industry sites (with editorial oversight), broken link building (finding dead pages on authoritative sites and offering your content as a replacement), resource page link building (getting listed on curated resource lists), and digital PR (creating newsworthy data or stories that earn natural mentions). Avoid agencies that rely heavily on "niche edits" (paying to insert links into existing articles) without disclosing the source.
  • Domain Authority and Trust Flow targets. No agency can guarantee a specific DA increase, because DA is Moz's composite metric, recalculated against the link graph of the entire web rather than anything the agency controls. However, they should set targets for the number of referring domains (e.g., 10–20 new domains per month) and the average Trust Flow of those domains (e.g., >20). Be skeptical of agencies that promise "50 high-DA backlinks in 30 days"—that often means PBNs or expired domain redirects.
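You can sanity-check the referring-domain targets yourself. The sketch below assumes a CSV export with `domain` and `trust_flow` columns (column names vary by tool, so treat them and the file name as placeholders):

```python
# Flag referring domains below the Trust Flow floor suggested above.
# "referring_domains.csv" and its column names are illustrative; adapt
# them to your backlink tool's actual export format.
import csv

TF_FLOOR = 20

with open("referring_domains.csv", newline="") as f:
    weak = [row["domain"] for row in csv.DictReader(f)
            if float(row["trust_flow"]) < TF_FLOOR]

print(f"{len(weak)} referring domains below Trust Flow {TF_FLOOR}:")
for domain in weak:
    print(" -", domain)
```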
Table 2: Link Building Methods—Risk vs. Reward

| Method | Risk Level | Typical Cost | Long-Term Value |
|---|---|---|---|
| Guest posting on editorial sites | Low | $200–$1,000 per post | High (if site is authoritative and relevant) |
| Broken link building | Low | Time-intensive, low cash cost | High (earned, not paid) |
| Digital PR (data-driven campaigns) | Low | $2,000–$10,000 per campaign | Very high (natural links, brand awareness) |
| Niche edits (paid insertion) | Medium | $100–$500 per link | Medium (depends on site quality) |
| PBN links | High | $50–$200 per link | Low (penalty risk, devaluation over time) |
| Link exchanges (reciprocal) | Medium | Free (but time-intensive) | Low (Google may discount reciprocal links) |
| Automated directory submissions | High | $50–$200 | Very low (ignored or penalized) |

Black-hat warning signs:

  • "We have a network of 500 high-authority sites." (This is a PBN.)
  • "We can get you a link from Wikipedia." (Wikipedia links are nofollow and almost impossible to earn for commercial pages.)
  • "We guarantee a top 3 ranking." (No ethical agency can guarantee rankings because Google's algorithm changes constantly.)
  • "We use proprietary automated link building software." (Automated outreach is spam and often results in low-quality links.)

Step 4: Monitor Core Web Vitals and Site Performance Post-Launch

Technical SEO is not a one-time project. After the initial audit and fixes, the agency must monitor Core Web Vitals and other performance metrics continuously. This is where the "site health" aspect of your SEO services comes into play.

What to track monthly:

  • LCP, CLS, FID/INP – Use Google Search Console's Core Web Vitals report. If the percentage of good URLs drops below 75%, the agency should investigate. Common causes: new JavaScript libraries, third-party scripts (ads, analytics, chatbots), or large images uploaded by content teams.
  • Crawl stats – In Google Search Console, monitor total crawl requests, average response time, and crawl errors. A sudden spike in 404 errors or a drop in crawl rate may indicate a server issue or a misconfigured robots.txt.
  • Index coverage – Track the number of indexed pages versus submitted pages. If indexed pages drop sharply after a site update, the agency should check for canonicalization errors or accidental noindex tags.
  • Page speed – While Core Web Vitals are the ranking factor, page speed (measured by Time to First Byte or Fully Loaded Time) affects user experience and conversion rates. The agency should provide a monthly page speed report for the top 20 traffic-driving pages.
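You do not have to wait for the agency's report to run these checks. The public CrUX API returns the same field data Google uses; the sketch below (the API key and origin are placeholders) pulls the 75th-percentile values that the thresholds in Table 3 below are judged against:

```python
# Minimal sketch: query the public CrUX API for field Core Web Vitals.
# CRUX_API_KEY and the origin are placeholders.
import requests

ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
payload = {"origin": "https://example.com", "formFactor": "PHONE"}
resp = requests.post(ENDPOINT, params={"key": "CRUX_API_KEY"},
                     json=payload, timeout=10)
metrics = resp.json()["record"]["metrics"]

for name in ("largest_contentful_paint",
             "interaction_to_next_paint",
             "cumulative_layout_shift"):
    print(name, "p75 =", metrics[name]["percentiles"]["p75"])
```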
Table 3: Core Web Vitals Thresholds (Based on Google's Guidelines)

| Metric | Good | Needs Improvement | Poor |
|---|---|---|---|
| LCP | ≤ 2.5 seconds | 2.5–4.0 seconds | > 4.0 seconds |
| FID (retired March 2024) | ≤ 100 milliseconds | 100–300 milliseconds | > 300 milliseconds |
| INP (replaced FID in March 2024) | ≤ 200 milliseconds | 200–500 milliseconds | > 500 milliseconds |
| CLS | ≤ 0.1 | 0.1–0.25 | > 0.25 |

Risk alert: If the agency proposes "lazy loading all images" to improve LCP, ensure they implement it correctly. Lazy loading can actually worsen LCP if the hero image is lazy-loaded. The hero image should load with `loading="eager"` or as part of the initial HTML response.
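A minimal sketch of the correct split (file names and dimensions are placeholders):

```html
<!-- Hero image: shipped in the initial HTML, loaded eagerly, with
     explicit dimensions so it cannot shift layout. fetchpriority="high"
     hints the browser to fetch it ahead of other images. -->
<img src="hero.webp" width="1200" height="600" alt="Hero"
     loading="eager" fetchpriority="high">

<!-- Below-the-fold images are the ones that should be lazy-loaded. -->
<img src="gallery-01.webp" width="800" height="600" alt="Gallery"
     loading="lazy">
```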

Step 5: Establish a Reporting Cadence and Accountability Framework

The final step is not technical—it is about governance. Without clear reporting, you cannot tell if the agency is delivering value. Your brief must specify:

  • Monthly reports that include: organic traffic trends (by channel and device), keyword rankings for target terms (with movement flags), backlink acquisition (new referring domains, lost links, Trust Flow changes), Core Web Vitals progress, and index coverage changes.
  • Quarterly strategic reviews that go beyond the numbers. The agency should present a competitive landscape analysis (how your top competitors are gaining or losing ground), a content gap update (new opportunities based on search trends), and a technical roadmap for the next quarter.
  • Penalty and recovery protocol. The agency should have a documented process for handling Google manual actions (e.g., unnatural links) or algorithmic updates (e.g., helpful content update). This includes: how they detect the issue, how they communicate it to you, and the timeline for remediation.
What to avoid in reporting:
  • Vanity metrics like "total backlinks" without quality filters.
  • Rankings tracked only through tools that report "estimated traffic" as fact (those figures are modeled; Google does not expose per-keyword organic traffic data).
  • Traffic reports that do not segment organic vs. paid vs. direct.
  • Reports that show "impressions" as a success metric without corresponding clicks or CTR.
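That last anti-pattern is easy to verify yourself. Given a Search Console performance export, a few lines of Python will surface queries earning impressions but almost no clicks. The file and column names below match GSC's bulk CSV export as an assumption; verify them against your own file.

```python
# Find queries with high impressions but negligible clicks in a Google
# Search Console performance export. Thresholds are illustrative, and
# the column names ("Top queries", "Clicks", "Impressions") should be
# checked against your actual export.
import csv

with open("Queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        clicks = int(row["Clicks"].replace(",", ""))       # guard against
        imps = int(row["Impressions"].replace(",", ""))    # thousands separators
        if imps > 1000 and clicks / imps < 0.005:  # CTR under 0.5%
            print(f'{row["Top queries"]}: {imps} impressions, '
                  f'CTR {clicks / imps:.2%}')
```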

Summary: The Action Items for Your SEO Agency Brief

  1. Demand a technical SEO audit that covers crawl budget, Core Web Vitals, canonicalization, robots.txt, and XML sitemap. Reject any audit that relies solely on a crawler tool without server log analysis.
  2. Brief keyword research with explicit intent mapping. Require a content gap analysis that compares your pages to top-ranking competitors. Avoid bulk "optimization" that relies on automation.
  3. Evaluate link building strategies for risk. Reject PBNs, automated outreach, and guaranteed rankings. Prefer earned links through digital PR, broken link building, and editorial guest posting.
  4. Monitor Core Web Vitals and site health monthly. Set thresholds for good/poor performance and escalate deviations immediately.
  5. Establish a reporting cadence with monthly metrics and quarterly strategic reviews. Avoid vanity metrics and demand transparency on methodology.
By following this checklist, you move from being a passive client to an informed partner. The agency works for you, not the other way around. If they resist providing log files, refuse to share their link building sources, or promise rankings without technical groundwork, find another agency. The cost of a bad SEO contract is not just the monthly retainer—it is the lost opportunity and the potential penalty that could take months to recover from.

Tyler Alvarado

Analytics and Reporting Reviewer

Tyler audits tracking setups and interprets SEO data to inform strategy. He focuses on actionable insights from analytics platforms.
