The Technical SEO Checklist: How to Brief an Agency and Get Real Results
You’ve hired an SEO agency—or you’re about to. The brief lands in your inbox, and it’s full of promises about “increasing organic traffic by 300%” and “dominating the first page.” But here’s the uncomfortable truth: most SEO failures aren’t about bad strategy—they’re about poor execution buried in technical details that no one checked. A technical SEO audit isn’t a one-time checklist item; it’s the foundation that determines whether your site gets crawled, indexed, and ranked at all. If you don’t know how to brief an agency on crawl budget, Core Web Vitals, or canonical tags, you’re essentially handing over the keys to a car with no engine and hoping it drives.
This guide walks you through the critical technical SEO components you must include in your agency brief, explains what each one actually does, and flags the risks—black-hat links, broken redirects, neglected web vitals—that can turn a promising campaign into a penalty nightmare. By the end, you’ll have a checklist you can hand directly to your agency, along with the vocabulary to challenge their recommendations.
1. Start with a Technical SEO Audit: The Non-Negotiable Baseline
Before any content strategy or link building begins, your agency must run a comprehensive technical SEO audit. This isn’t a surface-level scan that checks for broken links and missing meta descriptions. A proper audit examines crawlability, indexation, site architecture, and server-side configurations that search engines use to understand your site.
What to include in your brief:
- Request a full crawl report using tools like Screaming Frog, Sitebulb, or Lumar (formerly DeepCrawl). The agency should provide raw data, not just a summary.
- Ask for a breakdown of crawl budget allocation: which pages are being crawled, how often, and which are being ignored. If your site has thousands of low-value pages (thin content, duplicate product variations, archive pages), they’re consuming crawl budget that should go to high-priority content.
- Demand a list of all 4xx and 5xx errors, redirect chains, and orphan pages (pages with no internal links). These are silent traffic killers; a quick spot-check sketch follows this list.
- Require a Core Web Vitals report (LCP, CLS, and INP, which replaced FID in March 2024) with field data from Google Search Console, not just lab data from Lighthouse. Field data reflects real user experiences; lab data is synthetic.
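To make the error-and-redirect item concrete, here is a minimal Python sketch for spot-checking a handful of URLs. It assumes the third-party `requests` library, and the URL list is a placeholder for your own pages; a real audit should come from the full crawler export, not a hand-rolled script.

```python
# Minimal spot-check: status codes and redirect chains for a short URL list.
# The URLs below are placeholders; swap in pages from your own site.
import requests

urls = [
    "https://example.com/",
    "https://example.com/old-page/",  # hypothetical URL that may redirect
]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.url for r in resp.history]  # each intermediate redirect hop
    if resp.status_code >= 400:
        print(f"{url} -> {resp.status_code} (4xx/5xx error)")
    elif len(hops) > 1:
        print(f"{url} -> {resp.url} via {len(hops)} hops (redirect chain)")
    else:
        print(f"{url} -> {resp.status_code} OK")
```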
2. Crawl Budget: Why It Matters and How to Brief It
Crawl budget is the number of pages Googlebot will crawl on your site within a given timeframe. It’s allocated based on your site’s authority, update frequency, and server response times. For large sites (10,000+ pages), mismanaging crawl budget means Google might never discover your best content.
How to brief your agency:
- Ask them to analyze your current crawl stats in Google Search Console: how many crawl requests per day, and what’s the average server response time?
- Request a prioritized crawl strategy: which sections of your site should be crawled first? This usually means your highest-traffic pages, newly published content, and pages with recent updates.
- Ensure they review your `robots.txt` file. A common mistake is accidentally blocking important pages (e.g., `/blog/` or `/products/`) while allowing endless session parameter URLs that waste crawl budget (see the example after this list).
- Ask for a sitemap.xml submission strategy. The sitemap should include only canonical, indexable pages—not filtered or paginated URLs.
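As a concrete reference for the robots.txt review, here is a minimal file in the spirit of the advice above. The directory and parameter names are placeholders; your own structure will differ:

```
User-agent: *
# Block internal/duplicate areas, not public content
Disallow: /admin/
Disallow: /cart/
# Cut off session-parameter URLs that burn crawl budget (parameter name is hypothetical)
Disallow: /*?sessionid=

Sitemap: https://example.com/sitemap.xml
```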
3. Core Web Vitals: The User Experience Gatekeeper
Core Web Vitals are a set of real-world metrics that measure loading performance (LCP), responsiveness (INP, which replaced FID in March 2024), and visual stability (CLS). Since Google’s page experience update, these metrics factor into rankings, typically as a tiebreaker between otherwise comparable pages on competitive queries.

What your agency must deliver:
- A baseline measurement of LCP, CLS, and INP using field data from the Chrome User Experience Report (CrUX); a sketch for pulling this via the CrUX API follows this list. Lab data from Lighthouse is useful for debugging but not for ranking assessment.
- A prioritized remediation plan. For example:
  - LCP > 2.5 seconds: Optimize the largest contentful element (often a hero image or video). Solutions include image compression, preloading the hero asset, CDN delivery, or server-side rendering. Lazy-load below-the-fold images only; lazy-loading the LCP element itself makes LCP worse.
  - CLS > 0.1: Identify layout shift sources (ads, dynamically injected content, web fonts with FOUT). Fix by reserving space for ads, using `font-display: swap` with a size-matched fallback font, and avoiding layout shifts on user interaction.
  - INP > 200ms: Reduce JavaScript execution time, break up long tasks, and defer non-critical scripts.
- A monitoring cadence: they should track vitals weekly and alert you to regressions.
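For the baseline measurement, here is a minimal Python sketch that pulls p75 field data from the CrUX API. It assumes you have a Google API key with the Chrome UX Report API enabled and the `requests` library installed; the origin is a placeholder.

```python
# Minimal sketch: p75 field data for an origin from the Chrome UX Report API.
# Requires an API key with the CrUX API enabled; the origin is a placeholder.
import requests

API_KEY = "YOUR_API_KEY"
resp = requests.post(
    f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}",
    json={"origin": "https://example.com"},
    timeout=10,
)
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]
for name in ("largest_contentful_paint", "cumulative_layout_shift",
             "interaction_to_next_paint"):
    if name in metrics:  # a metric is absent if there is too little field data
        print(f"{name}: p75 = {metrics[name]['percentiles']['p75']}")
```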
4. XML Sitemaps and robots.txt: The Gateway to Indexation
Your robots.txt file is the first thing Googlebot requests when it visits your site, and your sitemap.xml tells it which URLs you want indexed. If either is misconfigured, your entire SEO strategy collapses.
Briefing checklist:
- Sitemap.xml: Should contain only canonical URLs that you want indexed. Exclude paginated pages, parameter URLs, and thin content. Include a `<lastmod>` tag to signal freshness. Submit the sitemap via Google Search Console. (A minimal example appears at the end of this section.)
- robots.txt: Use `Disallow` directives sparingly. Block only non-public areas (admin panels, staging sites, duplicate content directories). Do not block CSS, JavaScript, or image files unless you have a specific reason—blocking them can prevent Google from rendering your pages correctly.
- Crawl-delay: Googlebot ignores the `crawl-delay` directive (some other crawlers honor it). If your site is on a slow server, improve server response time rather than trying to throttle crawlers.
Common mistakes to avoid:
- Having multiple sitemaps with overlapping URLs.
- Using `noindex` directives on pages that are in the sitemap (creates a conflict).
- Blocking Googlebot via robots.txt while expecting it to crawl your site.
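To anchor the sitemap guidance, here is a minimal sitemap.xml in the shape described in the checklist above (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Canonical, indexable URLs only; no parameter or paginated pages -->
  <url>
    <loc>https://example.com/products/widget/</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/widget-buying-guide/</loc>
    <lastmod>2024-04-18</lastmod>
  </url>
</urlset>
```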
5. Canonical Tags and Duplicate Content: The Silent Traffic Leak
Duplicate content isn’t a penalty—it’s a dilution problem. When Google sees multiple URLs with identical or very similar content, it splits ranking signals across them, weakening your authority. The canonical tag tells Google which version is the primary one.
How to brief your agency:
- Conduct a duplicate content audit: use a tool like Siteliner or Copyscape to find pages with >80% similarity.
- For each duplicate group, the agency must identify the canonical URL. This should be the most authoritative or most-linked page.
- Ensure the canonical tag is self-referencing on the chosen URL (e.g., `https://example.com/page/` points to itself); see the markup sketch after this list.
- Check for common canonical errors:
- Canonical pointing to a non-indexable page (blocked by robots.txt or noindex).
- Canonical pointing to a 404 page.
- Missing canonical tags on paginated pages. Note that Google no longer uses `rel="next"` and `rel="prev"` as indexing signals; give each paginated page a self-referencing canonical rather than canonicalizing every page to page one.
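The markup itself is a single line in the `<head>`. A sketch of the self-referencing pattern, with placeholder URLs:

```html
<!-- On https://example.com/page/ (the primary version): self-referencing -->
<link rel="canonical" href="https://example.com/page/" />

<!-- A duplicate such as https://example.com/page/?utm_source=news carries
     the same tag, consolidating ranking signals onto the primary URL -->
```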
6. On-Page Optimization and Keyword Research: Beyond Meta Tags
On-page optimization has evolved from stuffing keywords into meta descriptions to aligning content with search intent. Your agency must demonstrate they understand the difference between informational, navigational, commercial, and transactional queries.
What to include in the brief:
- Keyword research: Request a list of target keywords with search volume, difficulty, and intent classification. Avoid broad, high-volume keywords that are impossible to rank for unless your site has massive authority.
- Intent mapping: Each page should target a specific intent. A blog post about “how to fix a leaky faucet” (informational) should not have a transactional call-to-action like “buy our plumbing service now.” Instead, it should link to a service page.
- Content strategy: Ask for a content calendar that fills gaps in your existing coverage. Use tools like Ahrefs or SEMrush to find topics your competitors rank for but you don’t.
- On-page elements: Title tags (under 60 characters), meta descriptions (under 160 characters), header tags (H1, H2, H3 with natural keyword placement), image alt text, and internal linking structure.
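Pulled together, those on-page elements look something like the sketch below. The page, copy, and paths are hypothetical, with lengths inside the limits above:

```html
<head>
  <title>How to Fix a Leaky Faucet: A Step-by-Step Guide</title> <!-- ~48 chars -->
  <meta name="description" content="Learn how to fix a leaky faucet in under an hour with basic tools. Covers compression, ball, and cartridge faucets."> <!-- ~117 chars -->
</head>
<body>
  <h1>How to Fix a Leaky Faucet</h1>
  <h2>Tools You Will Need</h2>
  <img src="/images/faucet-parts.jpg" alt="Exploded diagram of faucet parts">
  <!-- Internal link to the service page, rather than a hard-sell CTA -->
  <a href="/services/plumbing-repair/">our plumbing repair service</a>
</body>
```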

7. Link Building: Quality Over Quantity, Always
Link building is the riskiest component of SEO. One bad link from a spammy directory can trigger a manual penalty that takes months to recover from. Your agency must be transparent about their outreach methods.
How to brief your agency:
- Define your link profile goals: Focus on earning links from authoritative, relevant sites in your industry. Tools like Moz’s Domain Authority or Majestic’s Trust Flow can help assess link quality, but no single metric is definitive.
- Request a backlink audit: Identify toxic links (from PBNs, link farms, irrelevant directories) and disavow them via Google’s Disavow Tool. The agency should provide a list of disavowed domains.
- Outreach guidelines: Require that guest posts, resource links, and digital PR are placed on sites with real traffic and editorial oversight. Avoid automated link exchange programs or paid links that violate Google’s spam policies (formerly the Webmaster Guidelines).
- Anchor text diversity: Natural link profiles mix branded anchors, naked URLs, and generic anchors (e.g., “click here”). Over-optimized exact-match anchors (e.g., “best SEO agency”) are a red flag.
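Anchor-text diversity is easy to sanity-check yourself from any backlink export. A minimal Python sketch, assuming a CSV with an `anchor` column (Ahrefs, Majestic, and Moz name this column differently, so adjust accordingly):

```python
# Minimal sketch: anchor-text distribution from a backlink export CSV.
# Assumes an "anchor" column; adjust the name to match your tool's export.
import csv
from collections import Counter

counts = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[row["anchor"].strip().lower()] += 1

total = sum(counts.values())
for anchor, n in counts.most_common(10):
    print(f"{anchor!r}: {n} ({n / total:.1%})")
# A single exact-match phrase dominating this list is the red flag noted above.
```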
8. Monitoring, Reporting, and Accountability
Finally, your brief must specify how success is measured and reported. SEO is a long-term game, but you should see tangible progress within 3–6 months.
Reporting checklist:
- Monthly reports that include:
- Organic traffic (by landing page and keyword group)
- Indexation status (pages indexed vs. submitted)
- Core Web Vitals trends
- Crawl errors and fixes
- Backlink profile changes (new links gained, toxic links disavowed)
- Goal alignment: Traffic is vanity without conversion. Ask for conversion tracking (form submissions, purchases, phone calls) tied to organic sessions.
- Accountability: If the agency misses a deadline or fails to address a critical issue (e.g., a 50% drop in indexed pages), what’s the escalation process?
Red flags to watch for:
- Reports that only show “increased traffic” without context (e.g., from what baseline?).
- Excuses for poor performance that blame Google updates without supporting data.
- Refusal to share raw data or tool access.
Summary Checklist for Your Agency Brief
| Component | What to Request | Risk to Avoid |
|---|---|---|
| Technical SEO audit | Full crawl report, Core Web Vitals field data, crawl budget analysis | Skipping audit, using only lab data |
| Crawl budget | Crawl stats from GSC, prioritized crawl strategy, robots.txt review | Blocking important pages, over-aggressive crawl |
| Core Web Vitals | Baseline field data, remediation plan, weekly monitoring | Lab-only optimizations, breaking UX |
| Sitemap & robots.txt | Clean sitemap with canonical URLs, minimal robots.txt blocks | Conflicting directives, blocked resources |
| Canonical tags | Duplicate content audit, self-referencing canonicals | Canonical to non-indexable pages |
| On-page optimization | Intent-mapped keywords, content strategy, proper meta tags | Over-optimization, keyword cannibalization |
| Link building | Backlink audit, toxic disavowal, quality outreach guidelines | Black-hat links, manual penalties |
| Reporting | Monthly traffic, indexation, vitals, backlink changes | Vanity metrics, no conversion tracking |
Final thought: A great SEO agency will push back on your brief if they think parts of it are unnecessary or harmful. That’s a good sign. But they should also welcome transparency and data-sharing. If they’re defensive about sharing raw data or dismissive of technical audits, find another partner. Your site’s health and your organic revenue depend on it.
For more on how to evaluate agency performance, read our guide on Technical SEO and Site Health. And if you’re building an in-house SEO team, start with Expert Technical SEO Services for Site Health & Performance Optimization.
