Engaging a top-tier SEO agency is rarely about buying a one-time fix; it is about commissioning a systematic diagnostic that aligns your site’s infrastructure with search engine crawling, rendering, and ranking logic. A common failure point in agency-client relationships is a lack of clarity in the brief. This article provides a practical, risk-aware checklist for briefing an SEO agency on technical audits, site performance, and Core Web Vitals—with special attention to the underlying network and API layers that often go overlooked.
1. Define the Scope: From Crawl Budget to Core Web Vitals
Before any audit begins, you must specify which technical layers the agency will examine. A comprehensive technical SEO audit should cover, at minimum:
- Crawl budget and crawlability: How Googlebot discovers and allocates resources to your URLs. Issues such as infinite crawl spaces, soft 404s, or excessive parameterized URLs can waste crawl budget.
- Core Web Vitals: Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), and Interaction to Next Paint (INP), which replaced First Input Delay (FID) as a Core Web Vital in March 2024. The agency should measure these against the 75th percentile of real-user data from the Chrome User Experience Report (CrUX); a query sketch follows this list.
- Infrastructure-level performance: Server response times, Time to First Byte (TTFB), CDN configuration, and how your hosting environment interacts with Google’s network APIs.
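The CrUX requirement above is easy to verify independently: the public CrUX API exposes exactly this p75 field data. Below is a minimal sketch in Python, assuming you have a (free) CrUX API key; the key and origin are placeholders, and response details (e.g., CLS being returned as a string) vary by metric.

```python
# Query the Chrome UX Report (CrUX) API for p75 field metrics.
# CRUX_API_KEY and the origin are placeholders.
import json
import urllib.request

CRUX_API_KEY = "YOUR_API_KEY"
ENDPOINT = ("https://chromeuxreport.googleapis.com/v1/"
            f"records:queryRecord?key={CRUX_API_KEY}")

def fetch_p75(origin: str, form_factor: str = "PHONE") -> dict:
    """Return p75 values for an origin's Core Web Vitals."""
    body = json.dumps({"origin": origin, "formFactor": form_factor}).encode()
    req = urllib.request.Request(
        ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        metrics = json.load(resp)["record"]["metrics"]
    wanted = ("largest_contentful_paint", "cumulative_layout_shift",
              "interaction_to_next_paint")
    return {m: metrics[m]["percentiles"]["p75"] for m in wanted if m in metrics}

print(fetch_p75("https://example.com"))
```

CrUX only reports origins above a traffic threshold; for low-traffic sites the API returns an error, and the agency will have to lean on lab data plus its own real-user monitoring.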
2. The XML Sitemap and robots.txt: The Gatekeepers of Discovery
Many audits treat XML sitemaps and robots.txt as afterthoughts, but they are the first signals an agency should validate. A poorly configured sitemap can cause index bloat, while an overly restrictive robots.txt can block critical resources.
Checklist for the agency:
| Component | What to verify | Common pitfalls |
|---|---|---|
| XML sitemap | Contains only canonical, indexable URLs; excludes paginated, filtered, or noindex pages; includes lastmod dates that reflect true content changes | Including paginated or session-based URLs; stale lastmod values; sitemap exceeds 50,000 URLs or 50 MB uncompressed |
| robots.txt | Allows crawling of the CSS, JS, and image files needed for rendering; disallows only non-public or infinite-crawl paths; is not used to hide thin content | Blocking CSS/JS that Googlebot needs to render pages; unintentionally disallowing specialized crawlers such as Googlebot-Image or Googlebot-News; using `Disallow: /` on a new site; including private paths that still leak in sitemaps |
| Canonical tags | Self-referencing on canonical pages; correctly pointing to the preferred version on duplicates; no conflicting signals with hreflang or noindex | Missing canonical on syndicated content; canonical pointing to a 301-redirected URL; canonicalizing every page of a paginated series to page 1 when no "view-all" page exists |
Risk callout: Using `robots.txt` to block thin content while still listing those URLs in the sitemap sends contradictory signals: Google cannot crawl the pages but may still index the bare URLs ("Indexed, though blocked by robots.txt" in Search Console). If the goal is deindexing, use `noindex` and allow crawling instead. The agency should flag any such inconsistencies.
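This particular contradiction is easy to catch mechanically. Here is a minimal sketch using only the Python standard library; the site URL and sitemap path are illustrative assumptions (a real site may declare its sitemap in robots.txt or split it into an index file).

```python
# Flag URLs that appear in the XML sitemap but are disallowed by
# robots.txt for Googlebot -- a contradictory signal worth fixing.
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITE = "https://example.com"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

rp = urllib.robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()

with urllib.request.urlopen(f"{SITE}/sitemap.xml") as resp:
    tree = ET.parse(resp)

for loc in tree.iterfind(".//sm:loc", NS):
    url = loc.text.strip()
    if not rp.can_fetch("Googlebot", url):
        print(f"CONTRADICTION: in sitemap but blocked by robots.txt: {url}")
```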
3. Duplicate Content and Canonicalization: The Silent Performance Drain
Duplicate content is not a penalty in the algorithmic sense, but it dilutes link equity and confuses Google’s choice of the canonical version. An agency should perform a site-wide duplicate content analysis using a tool like Screaming Frog or DeepCrawl, focusing on:
- URL parameter duplication (e.g., `?sort=price`, `?session_id=abc`); a grouping sketch follows this list
- HTTP vs. HTTPS and www vs. non-www (ensure 301 redirects, not 302)
- Product variation pages (e.g., different colors of the same item)
- Pagination and infinite scroll (Google retired `rel="next"`/`rel="prev"` as indexing signals in 2019; rely on self-canonicalizing paginated pages with crawlable links, or a "view-all" strategy where page weight allows)
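As referenced in the first bullet, parameter-driven duplication can be surfaced from any crawl export by normalizing each URL (stripping ignorable parameters, lowercasing the host, sorting what remains) and grouping on the result. A sketch with illustrative data; the set of ignorable parameters is site-specific and should come out of the audit itself, not be assumed.

```python
# Group crawled URLs by their normalized form; groups with more than
# one member are duplicate candidates. Data here is illustrative.
from collections import defaultdict
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

IGNORED_PARAMS = {"session_id", "sort", "utm_source", "utm_medium", "utm_campaign"}

def normalize(url: str) -> str:
    parts = urlsplit(url)
    kept = sorted(
        (k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS
    )
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, urlencode(kept), ""))

crawled = [
    "https://example.com/shoes?sort=price",
    "https://example.com/shoes?session_id=abc",
    "https://example.com/shoes",
]

groups = defaultdict(list)
for url in crawled:
    groups[normalize(url)].append(url)

for canonical, variants in groups.items():
    if len(variants) > 1:
        print(f"{canonical} <- {len(variants)} duplicate candidates: {variants}")
```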

4. On-Page Optimization and Intent Mapping: Beyond Keywords
On-page optimization has evolved from stuffing keywords into title tags to aligning content with search intent. An agency should map your existing pages to one of four intent categories: informational, navigational, commercial, or transactional. This intent mapping directly informs content strategy and internal linking.
Action items for the agency brief:
- Keyword research with intent classification: Request a keyword list that includes search volume, difficulty, and intent label. Avoid agencies that present a flat list of high-volume keywords without explaining how they map to your conversion funnel.
- Content gap analysis: Using tools like Ahrefs or Semrush, the agency should identify topics your competitors rank for that you do not. The output should be a content strategy calendar, not just a list of “we need more blog posts.”
- Internal linking audit: The agency should evaluate whether your internal link structure distributes PageRank effectively. A common issue is orphan pages (no internal links pointing to them) or over-optimized anchor text; a minimal orphan-detection sketch follows this list.
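Orphan detection, mentioned in the last item, reduces to a set difference once the agency exports the crawl: pages known to exist (from the sitemap or CMS) minus pages that receive at least one internal link. A minimal sketch with illustrative data:

```python
# Find orphan pages: known URLs with no inbound internal links.
# `all_pages` would come from the sitemap/CMS and `edges` from a
# crawler export (source page -> linked page); data is illustrative.
all_pages = {"/", "/pricing", "/blog/post-1", "/legacy-landing"}
edges = [
    ("/", "/pricing"),
    ("/", "/blog/post-1"),
    ("/blog/post-1", "/pricing"),
]

linked_to = {target for _source, target in edges}
orphans = all_pages - linked_to - {"/"}  # the homepage needs no inbound link

print(f"Orphan pages: {sorted(orphans)}")  # -> ['/legacy-landing']
```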
5. Link Building and Backlink Profile: The Quality vs. Quantity Trap
Link building remains a high-risk, high-reward activity. A reputable agency will typically avoid guaranteeing a specific number of backlinks per month or promising a fixed Domain Authority (DA) or Trust Flow (TF) increase. Instead, they should present a link acquisition strategy based on relevance, authority, and editorial merit.
What to include in your brief:
| Requirement | What it means | Red flags |
|---|---|---|
| Backlink profile audit | Analyze existing backlinks for toxic or spammy domains using tools like Majestic or LinkResearchTools | Agency refuses to show the raw data; claims all backlinks are “high quality” without evidence |
| Outreach strategy | Target relevant industry publications, resource pages, and broken link opportunities | Agency mentions “private blog networks” (PBNs), paid links, or automated directory submissions |
| Disavow file | Prepare a disavow file for links that are clearly manipulative or from penalized domains | Agency says "we never need to disavow" or "Google ignores bad links anyway" (Google discounts most spam automatically, but the disavow tool exists precisely for manual actions and clearly manipulative links) |
| Link relevance | Links should come from sites topically related to your niche | Agency builds links from unrelated domains (e.g., a pet food site linking to a B2B SaaS product) |
Risk callout: Black-hat link building—such as PBNs, link exchanges, or automated comment spam—can trigger a manual penalty or algorithmic demotion (e.g., Penguin). A single bad link campaign can potentially harm your site's performance, though the severity and speed of impact vary widely. Always ask the agency to document their link acquisition process and provide examples of past outreach emails.
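If the audit does surface links worth disavowing, the deliverable is a plain-text file in Google's documented disavow format: one `domain:` entry or full URL per line, with `#` for comments, uploaded through Search Console's disavow tool. A sketch that assembles one from a reviewed list; the domains and URL shown are placeholders.

```python
# Build a Google disavow file from a manually reviewed toxic-link list.
# In practice the inputs come from the backlink audit (Majestic,
# LinkResearchTools, etc.); these entries are placeholders.
toxic_domains = ["spammy-directory.example", "pbn-network.example"]
toxic_urls = ["https://forum.example/thread?reply=123"]

lines = ["# Disavow file generated from the backlink audit"]
lines += [f"domain:{d}" for d in sorted(toxic_domains)]
lines += sorted(toxic_urls)

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```

Disavowing should follow manual review of the audit data, never precede it; an agency that disavows in bulk on autopilot is as much a red flag as one that refuses to disavow at all.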
6. Core Web Vitals and Site Performance: The Network API Factor
Core Web Vitals are not just a front-end concern; they are deeply influenced by server configuration, CDN choice, and the network path between the user and your origin. For sites hosted on Google Cloud, for example, the Network Service Tiers setting (Premium vs. Standard) changes how traffic is routed to Google's edge and can measurably affect TTFB and, by extension, LCP.

What the agency should evaluate:
- LCP (Largest Contentful Paint): The most common causes of slow LCP are slow server response (high TTFB), render-blocking resources, and oversized images. The agency should measure TTFB from multiple geographic locations (a single-vantage-point sketch follows this list) and recommend CDN-level caching, e.g., Cloud CDN or an equivalent edge cache in front of the origin.
- CLS (Cumulative Layout Shift): Caused by images or ads without explicit dimensions, web fonts loading asynchronously, or dynamic content injected above the fold. The audit should include a list of all elements causing layout shifts.
- INP (Interaction to Next Paint): The responsiveness metric that replaced FID in March 2024; it measures how quickly the page responds to user interactions (clicks, taps, keyboard events). Poor INP often stems from long JavaScript tasks or heavy third-party scripts.
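Of these inputs, TTFB is the easiest to spot-check yourself. The sketch below approximates it by timing the delay to the first response byte, using only the Python standard library; it measures from a single vantage point (your machine), so it complements rather than replaces the multi-location testing and CrUX field data requested above. The URL is a placeholder.

```python
# Approximate TTFB: time from sending the request to receiving the
# first byte of the response body (includes DNS, TLS, and server time).
import time
import urllib.request

def ttfb_ms(url: str) -> float:
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)  # wait for the first body byte
    return (time.perf_counter() - start) * 1000

url = "https://example.com/"  # placeholder
samples = sorted(ttfb_ms(url) for _ in range(5))
print(f"median TTFB for {url}: {samples[len(samples) // 2]:.0f} ms")
```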
7. The Technical SEO Audit Report: What to Expect
A proper audit report should not be a 200-page PDF that gathers dust. It should be an actionable document with prioritized issues, estimated effort, and expected impact. Use the following checklist to evaluate the agency’s deliverable:
| Section | Must include | Nice to have |
|---|---|---|
| Executive summary | Top 5 critical issues; estimated impact on organic traffic | Timeline for fixes; owner assignment |
| Crawl diagnostics | Broken links, redirect chains, 4xx/5xx errors | Crawl budget waste estimate |
| Indexation analysis | Pages indexed vs. pages in sitemap; noindex vs. canonical conflicts | Index bloat percentage |
| Core Web Vitals | LCP, CLS, INP from CrUX; lab vs. field data comparison | Performance budget recommendations |
| Backlink profile | Toxic link count; domain authority distribution | Competitor backlink gap analysis |
| On-page and content | Missing title tags, meta descriptions, H1s; keyword cannibalization | Intent mapping for top 50 pages |
Risk callout: Some agencies will present an audit that lists hundreds of minor issues (e.g., missing alt text on 50 images) while ignoring the one critical issue (e.g., a misconfigured robots.txt blocking all JavaScript). Insist on a severity rating (Critical, High, Medium, Low) and a clear explanation of why each issue matters.
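Severity-first prioritization is easy to enforce once findings are structured data rather than prose. A sketch with illustrative findings; the "impact" score is an assumed estimate the agency would supply.

```python
# Sort audit findings by severity, then by estimated impact, so one
# critical robots.txt issue outranks dozens of minor ones.
SEVERITY_RANK = {"Critical": 0, "High": 1, "Medium": 2, "Low": 3}

findings = [
    {"issue": "Missing alt text on 50 images", "severity": "Low", "impact": 2},
    {"issue": "robots.txt blocks all JavaScript", "severity": "Critical", "impact": 95},
    {"issue": "Redirect chains on category pages", "severity": "Medium", "impact": 20},
]

findings.sort(key=lambda f: (SEVERITY_RANK[f["severity"]], -f["impact"]))
for f in findings:
    print(f"[{f['severity']:>8}] impact ~{f['impact']:>3}%  {f['issue']}")
```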
8. Summary and Next Steps
Briefing an SEO agency for technical SEO and site performance is not about transferring responsibility; it is about establishing a shared framework for diagnosis and improvement. By specifying the scope (crawl budget, Core Web Vitals, network and CDN configuration), demanding evidence-based deliverables (CrUX field data, crawl logs, performance budgets), and maintaining healthy skepticism toward guarantees, you set the foundation for a productive partnership.
Final checklist for your brief:
- Require a crawl budget analysis using Search Console and server logs (a log-parsing sketch follows this checklist).
- Request Core Web Vitals data from CrUX (field data), not just Lighthouse (lab data).
- Audit XML sitemap and robots.txt for contradictory signals.
- Demand a duplicate content and canonicalization report.
- Insist on intent mapping for all target keywords.
- Verify link building strategy excludes black-hat tactics (PBNs, paid links, automated outreach).
- Evaluate performance budget and CDN/network tier configuration.
- Review the audit report for severity ratings and actionable recommendations.
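For the first checklist item, the core of a log-based crawl budget analysis fits in a few lines. A sketch assuming a combined-format access log at an illustrative path; note that matching on the user-agent string alone is spoofable, so a rigorous audit should also verify Googlebot hits via reverse DNS.

```python
# Summarize Googlebot requests from a server access log to spot crawl
# budget waste (e.g., heavy crawling of parameterized URLs).
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d{3}')

hits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:  # spoofable; verify via reverse DNS too
            continue
        m = LOG_LINE.search(line)
        if m:
            path = m.group("path")
            hits["<parameterized URLs>" if "?" in path else path] += 1

for bucket, count in hits.most_common(10):
    print(f"{count:>6}  {bucket}")
```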
