Engaging a top-tier SEO agency is rarely about buying a one-time fix; it is about commissioning a systematic diagnostic that aligns your site’s infrastructure with search engine crawling, rendering, and ranking logic. A common failure point in agency-client relationships is a lack of clarity in the brief. This article provides a practical, risk-aware checklist for briefing an SEO agency on technical audits, site performance, and Core Web Vitals—with special attention to the underlying network and API layers that often go overlooked.

1. Define the Scope: From Crawl Budget to Core Web Vitals

Before any audit begins, you must specify which technical layers the agency will examine. A comprehensive technical SEO audit should cover, at minimum:

  • Crawl budget and crawlability: How Googlebot discovers and allocates resources to your URLs. Issues such as infinite crawl spaces, soft 404s, or excessive parameterized URLs can waste crawl budget.
  • Core Web Vitals: Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP), which replaced First Input Delay (FID) as a Core Web Vital in March 2024. The agency should measure these against the 75th percentile of real-user data from the Chrome User Experience Report (CrUX).
  • Infrastructure-level performance: Server response times, Time to First Byte (TTFB), CDN configuration, and how your hosting environment interacts with Google’s network APIs.
Action item for your brief: Explicitly state that the audit must include a crawl budget analysis using Google Search Console’s crawl stats report and a full Core Web Vitals diagnosis using both lab data (Lighthouse) and field data (CrUX). If your site runs on Google Cloud, request a review of Cloud CDN, Cloud Load Balancing, and your Network Service Tiers selection (Premium vs. Standard).
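If you want to sanity-check the agency’s field-data numbers yourself, the CrUX API exposes the same 75th-percentile data. A minimal Python sketch, assuming the `requests` library is installed and you have a CrUX API key (the `YOUR_API_KEY` placeholder is hypothetical):

```python
# Minimal sketch: pull 75th-percentile field data from the Chrome UX Report API.
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
API_KEY = "YOUR_API_KEY"  # hypothetical placeholder -- create a key in Google Cloud Console

def fetch_crux_p75(origin: str, form_factor: str = "PHONE") -> dict:
    """Return p75 values for an origin's Core Web Vitals, keyed by metric name."""
    resp = requests.post(
        CRUX_ENDPOINT,
        params={"key": API_KEY},
        json={"origin": origin, "formFactor": form_factor},
        timeout=10,
    )
    resp.raise_for_status()
    metrics = resp.json()["record"]["metrics"]
    wanted = ("largest_contentful_paint", "cumulative_layout_shift",
              "interaction_to_next_paint")
    return {m: metrics[m]["percentiles"]["p75"] for m in wanted if m in metrics}

if __name__ == "__main__":
    print(fetch_crux_p75("https://example.com"))
```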

2. The XML Sitemap and robots.txt: The Gatekeepers of Discovery

Many audits treat XML sitemaps and robots.txt as afterthoughts, but they are the first signals an agency should validate. A poorly configured sitemap can cause index bloat, while an overly restrictive robots.txt can block critical resources.

Checklist for the agency:

| Component | What to verify | Common pitfalls |
| --- | --- | --- |
| XML sitemap | Contains only canonical, indexable URLs; excludes paginated, filtered, or noindex pages; includes lastmod dates that reflect true content changes | Including paginated or session-based URLs; stale lastmod values; sitemap exceeding 50,000 URLs or 50 MB |
| robots.txt | Allows crawling of CSS, JS, and image files needed for rendering; disallows only non-public or infinite-scroll paths; is not used to hide thin content | Blocking the CSS/JS Googlebot needs to render pages; accidentally disallowing specialized crawlers (Googlebot-Image, Googlebot-News); using `Disallow: /` on a new site; listing private paths that still leak via sitemaps |
| Canonical tags | Self-referencing on canonical pages; correctly pointing to the preferred version on duplicates; no conflicting signals with hreflang or noindex | Missing canonical on syndicated content; canonical pointing to a 301-redirected URL; canonicalizing a paginated series to a “view-all” page that doesn’t exist |

Risk callout: Using `robots.txt` to block thin content while still including those URLs in the sitemap sends contradictory signals: robots.txt controls crawling, not indexing, so blocked URLs submitted in a sitemap can still end up indexed without their content (surfacing as “Indexed, though blocked by robots.txt” in Search Console). The agency should flag any such inconsistencies; a simple automated check is sketched below.
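A check like this is easy to automate. The following sketch uses only the Python standard library and placeholder example.com URLs; it assumes a flat sitemap rather than a sitemap index:

```python
# Minimal sketch: flag URLs that appear in the XML sitemap but are disallowed
# for Googlebot in robots.txt -- the contradictory signal described above.
import urllib.robotparser
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
ROBOTS_URL = "https://example.com/robots.txt"    # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

robots = urllib.robotparser.RobotFileParser()
robots.set_url(ROBOTS_URL)
robots.read()

with urllib.request.urlopen(SITEMAP_URL) as f:
    tree = ET.parse(f)

# Iterate over every <loc> entry in the sitemap and test it against robots.txt.
for loc in tree.iter(f"{NS}loc"):
    url = (loc.text or "").strip()
    if url and not robots.can_fetch("Googlebot", url):
        print(f"CONTRADICTION: in sitemap but disallowed for Googlebot: {url}")
```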

3. Duplicate Content and Canonicalization: The Silent Performance Drain

Duplicate content is not a penalty in the algorithmic sense, but it dilutes link equity and confuses Google’s choice of the canonical version. An agency should perform a site-wide duplicate content analysis using a tool like Screaming Frog or DeepCrawl, focusing on:

  • URL parameter duplication (e.g., `?sort=price`, `?session_id=abc`)
  • HTTP vs. HTTPS and www vs. non-www (ensure 301 redirects, not 302)
  • Product variation pages (e.g., different colors of the same item)
  • Pagination and infinite scroll (note that Google no longer uses `rel="next"` and `rel="prev"` as an indexing signal; ensure each paginated page is self-canonical and individually crawlable, or use a “view-all” strategy)
How to brief the agency: Ask for a report that quantifies the percentage of your indexable URLs that are duplicate or near-duplicate. Require a canonical tag audit that checks for (a) missing canonicals on non-canonical pages, (b) canonicals pointing to 4xx or 5xx URLs, and (c) contradictory signals (canonical + noindex on the same page).
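Check (b) in particular is cheap to verify in-house. A minimal sketch, assuming `requests` and `beautifulsoup4` are installed and using placeholder URLs:

```python
# Minimal sketch: flag canonical tags that are missing, point at redirects,
# or point at broken URLs -- checks (a) and (b) from the brief.
import requests
from bs4 import BeautifulSoup

def audit_canonical(page_url: str) -> str:
    html = requests.get(page_url, timeout=10).text
    link = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    if link is None or not link.get("href"):
        return "MISSING canonical"
    target = link["href"]
    # allow_redirects=False so we see the raw status of the canonical target
    status = requests.head(target, timeout=10, allow_redirects=False).status_code
    if 300 <= status < 400:
        return f"canonical -> {target} is a {status} redirect"
    if status >= 400:
        return f"canonical -> {target} returns {status}"
    return "OK"

for url in ["https://example.com/", "https://example.com/products/widget"]:
    print(url, "->", audit_canonical(url))
```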

4. On-Page Optimization and Intent Mapping: Beyond Keywords

On-page optimization has evolved from stuffing keywords into title tags to aligning content with search intent. An agency should map your existing pages to one of four intent categories: informational, navigational, commercial, or transactional. This intent mapping directly informs content strategy and internal linking.

Action items for the agency brief:

  1. Keyword research with intent classification: Request a keyword list that includes search volume, difficulty, and intent label. Avoid agencies that present a flat list of high-volume keywords without explaining how they map to your conversion funnel.
  2. Content gap analysis: Using tools like Ahrefs or Semrush, the agency should identify topics your competitors rank for that you do not. The output should be a content strategy calendar, not just a list of “we need more blog posts.”
  3. Internal linking audit: The agency should evaluate whether your internal link structure distributes PageRank effectively. A common issue is orphan pages (no internal links pointing to them) or over-optimized anchor text; see the sketch after this list for a quick orphan-page check.
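As referenced in item 3, orphan detection reduces to a set difference once you have the internal link graph. A minimal sketch, assuming you can export the graph as a source-to-targets mapping (e.g., from a Screaming Frog outlinks export; the data below is illustrative):

```python
# Minimal sketch: find orphan pages -- pages in the crawl that no other page links to.
link_graph = {
    "https://example.com/": ["https://example.com/blog", "https://example.com/pricing"],
    "https://example.com/blog": ["https://example.com/blog/post-1"],
    "https://example.com/pricing": [],
    "https://example.com/blog/post-1": [],
    "https://example.com/old-landing-page": [],  # nothing links here
}

linked_to = {target for targets in link_graph.values() for target in targets}
orphans = [page for page in link_graph
           if page not in linked_to
           and page != "https://example.com/"]  # the homepage is its own entry point
print("Orphan pages:", orphans)
```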
Risk callout: Beware of agencies that promise quick results by targeting high-volume keywords without considering intent. For example, a “buy now” page targeting an informational query like “how to choose a CRM” will likely have high bounce rates and low conversion. Intent mismatch is one of the most common—and most preventable—on-page failures.

5. Link Building and Backlink Profile: The Quality vs. Quantity Trap

Link building remains a high-risk, high-reward activity. A reputable agency will typically avoid guaranteeing a specific number of backlinks per month or promising a fixed Domain Authority (DA) or Trust Flow (TF) increase. Instead, they should present a link acquisition strategy based on relevance, authority, and editorial merit.

What to include in your brief:

| Requirement | What it means | Red flags |
| --- | --- | --- |
| Backlink profile audit | Analyze existing backlinks for toxic or spammy domains using tools like Majestic or LinkResearchTools | Agency refuses to show the raw data; claims all backlinks are “high quality” without evidence |
| Outreach strategy | Target relevant industry publications, resource pages, and broken-link opportunities | Agency mentions “private blog networks” (PBNs), paid links, or automated directory submissions |
| Disavow file | Prepare a disavow file for links that are clearly manipulative or from penalized domains | Agency says “we never need to disavow” or “Google ignores bad links anyway” (both are incorrect) |
| Link relevance | Links should come from sites topically related to your niche | Agency builds links from unrelated domains (e.g., a pet food site linking to a B2B SaaS product) |

Risk callout: Black-hat link building—such as PBNs, link exchanges, or automated comment spam—can trigger a manual penalty or algorithmic demotion (e.g., Penguin). A single bad link campaign can potentially harm your site's performance, though the severity and speed of impact vary widely. Always ask the agency to document their link acquisition process and provide examples of past outreach emails.
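If a disavow is warranted, the deliverable itself is simple: a plain-text file of `domain:` and URL entries that you upload to Google’s disavow tool. A minimal sketch that writes one from a manually reviewed list (the domains and URLs below are illustrative):

```python
# Minimal sketch: generate a disavow file in the format Google's disavow tool
# accepts (one `domain:` or URL entry per line, `#` for comments).
from datetime import date

toxic_domains = ["spammy-links.example", "pbn-network.example"]  # from the agency's audit
toxic_urls = ["https://forum.example/thread?id=123"]             # individual bad links

lines = [f"# Disavow file generated {date.today().isoformat()} after manual review"]
lines += [f"domain:{d}" for d in sorted(toxic_domains)]
lines += sorted(toxic_urls)

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```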

6. Core Web Vitals and Site Performance: The Network API Factor

Core Web Vitals are not just a front-end concern; they are deeply influenced by server configuration, CDN choice, and the network path between the user and your origin. For sites hosted on Google Cloud, the Network Service Tiers setting (Premium vs. Standard) can significantly affect LCP and TTFB.

What the agency should evaluate:

  • LCP (Largest Contentful Paint): The most common causes of slow LCP are slow server response (high TTFB), render-blocking resources, and oversized images. The agency should measure TTFB from multiple geographic locations and recommend CDN optimization (e.g., caching static assets at the edge with Cloud CDN).
  • CLS (Cumulative Layout Shift): Caused by images or ads without explicit dimensions, web fonts loading asynchronously, or dynamic content injected above the fold. The audit should include a list of all elements causing layout shifts.
  • INP (Interaction to Next Paint): A newer metric that measures responsiveness to user interactions (clicks, taps, keyboard events). Poor INP often stems from long JavaScript execution times or heavy third-party scripts.
Action item for your brief: Ask the agency to provide a performance budget: a set of thresholds for LCP (<2.5 s), INP (<200 ms), CLS (<0.1), and TTFB (<800 ms). If your site uses Google Cloud, request a specific review of whether you are on the Premium Network Service Tier (which keeps traffic on Google’s private backbone for most of the path) or the Standard Tier (which relies on the public internet); the Premium Tier can notably improve TTFB for users far from your origin.
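Once the budget is agreed, holding the agency to it can be scripted. A minimal sketch that compares measured p75 values (e.g., pulled via the CrUX query shown earlier) against the thresholds above; the measured figures here are illustrative:

```python
# Minimal sketch: validate measured field metrics against the performance budget.
BUDGET = {
    "lcp_ms": 2500,  # Largest Contentful Paint
    "inp_ms": 200,   # Interaction to Next Paint
    "cls": 0.1,      # Cumulative Layout Shift (unitless)
    "ttfb_ms": 800,  # Time to First Byte
}

measured = {"lcp_ms": 3100, "inp_ms": 180, "cls": 0.04, "ttfb_ms": 950}  # illustrative

for metric, limit in BUDGET.items():
    value = measured.get(metric)
    status = "PASS" if value is not None and value <= limit else "FAIL"
    print(f"{metric:8s} {value} (budget {limit}) -> {status}")
```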

7. The Technical SEO Audit Report: What to Expect

A proper audit report should not be a 200-page PDF that gathers dust. It should be an actionable document with prioritized issues, estimated effort, and expected impact. Use the following checklist to evaluate the agency’s deliverable:

| Section | Must include | Nice to have |
| --- | --- | --- |
| Executive summary | Top 5 critical issues; estimated impact on organic traffic | Timeline for fixes; owner assignment |
| Crawl diagnostics | Broken links, redirect chains, 4xx/5xx errors | Crawl budget waste estimate |
| Indexation analysis | Pages indexed vs. pages in sitemap; noindex vs. canonical conflicts | Index bloat percentage |
| Core Web Vitals | LCP, CLS, INP from CrUX; lab vs. field data comparison | Performance budget recommendations |
| Backlink profile | Toxic link count; domain authority distribution | Competitor backlink gap analysis |
| On-page and content | Missing title tags, meta descriptions, H1s; keyword cannibalization | Intent mapping for top 50 pages |

Risk callout: Some agencies will present an audit that lists hundreds of minor issues (e.g., missing alt text on 50 images) while ignoring the one critical issue (e.g., a misconfigured robots.txt blocking all JavaScript). Insist on a severity rating (Critical, High, Medium, Low) and a clear explanation of why each issue matters.
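Severity-first triage is straightforward to enforce on whatever issue export the agency provides. A minimal sketch, with an illustrative issue list, that sorts findings by severity and then by pages affected:

```python
# Minimal sketch: surface critical audit findings first, per the severity scale above.
SEVERITY_RANK = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

issues = [  # illustrative data
    {"issue": "Missing alt text on 50 images", "severity": "Low", "pages": 50},
    {"issue": "robots.txt blocks all JavaScript", "severity": "Critical", "pages": 12000},
    {"issue": "Redirect chains (3+ hops)", "severity": "Medium", "pages": 240},
]

for item in sorted(issues, key=lambda i: (SEVERITY_RANK[i["severity"]], -i["pages"])):
    print(f'[{item["severity"]:8s}] {item["issue"]} ({item["pages"]} pages)')
```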

8. Summary and Next Steps

Briefing an SEO agency for technical SEO and site performance is not about transferring responsibility; it is about establishing a shared framework for diagnosis and improvement. By specifying the scope (crawl budget, Core Web Vitals, network tier), demanding evidence-based deliverables (CrUX data, crawl logs, performance budgets), and maintaining healthy skepticism toward guarantees, you set the foundation for a productive partnership.

Final checklist for your brief:

  • Require a crawl budget analysis using Search Console and server logs.
  • Request Core Web Vitals data from CrUX (field data), not just Lighthouse (lab data).
  • Audit XML sitemap and robots.txt for contradictory signals.
  • Demand a duplicate content and canonicalization report.
  • Insist on intent mapping for all target keywords.
  • Verify link building strategy excludes black-hat tactics (PBNs, paid links, automated outreach).
  • Evaluate performance budget and CDN/network tier configuration.
  • Review the audit report for severity ratings and actionable recommendations.
For further reading, see our guides on technical SEO and site health, and on Core Web Vitals optimization strategies.

Tyler Alvarado

Analytics and Reporting Reviewer

Tyler audits tracking setups and interprets SEO data to inform strategy. He focuses on actionable insights from analytics platforms.
