The Technical SEO Checklist: How to Brief an Agency and Get Real Results

You’ve hired an SEO agency—or you’re about to. The brief lands in your inbox, and it’s full of promises about “increasing organic traffic by 300%” and “dominating the first page.” But here’s the uncomfortable truth: most SEO failures aren’t about bad strategy—they’re about poor execution buried in technical details that no one checked. A technical SEO audit isn’t a one-time checklist item; it’s the foundation that determines whether your site gets crawled, indexed, and ranked at all. If you don’t know how to brief an agency on crawl budget, Core Web Vitals, or canonical tags, you’re essentially handing over the keys to a car with no engine and hoping it drives.

This guide walks you through the critical technical SEO components you must include in your agency brief, explains what each one actually does, and flags the risks—black-hat links, broken redirects, neglected web vitals—that can turn a promising campaign into a penalty nightmare. By the end, you’ll have a checklist you can hand directly to your agency, along with the vocabulary to challenge their recommendations.

1. Start with a Technical SEO Audit: The Non-Negotiable Baseline

Before any content strategy or link building begins, your agency must run a comprehensive technical SEO audit. This isn’t a surface-level scan that checks for broken links and missing meta descriptions. A proper audit examines crawlability, indexation, site architecture, and server-side configurations that search engines use to understand your site.

What to include in your brief:

  • Request a full crawl report using tools like Screaming Frog, Sitebulb, or DeepCrawl. The agency should provide raw data, not just a summary.
  • Ask for a breakdown of crawl budget allocation: which pages are being crawled, how often, and which are being ignored. If your site has thousands of low-value pages (thin content, duplicate product variations, archive pages), they’re consuming crawl budget that should go to high-priority content.
  • Demand a list of all 4xx and 5xx errors, redirect chains, and orphan pages (pages with no internal links). These are silent traffic killers; the status-check sketch after this list shows one way to surface them yourself.
  • Require a Core Web Vitals report (LCP, CLS, FID/INP) with field data from Google Search Console, not just lab data from Lighthouse. Field data reflects real user experiences; lab data is synthetic.
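
If you want to spot-check status codes and redirect chains before the agency reports back, a small script will do. A minimal sketch in Python, assuming you have exported a URL list from your crawler (the URLs below are placeholders):

```python
import requests

# Minimal sketch: flag 4xx/5xx responses and redirect chains for a URL list.
# Replace these placeholder URLs with an export from your crawler.
urls = [
    "https://example.com/",
    "https://example.com/old-page/",
]

for url in urls:
    try:
        r = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")
        continue
    if r.history:  # each hop in a redirect chain appears here
        chain = " -> ".join(f"{h.status_code} {h.url}" for h in r.history)
        print(f"CHAIN  {chain} -> {r.status_code} {r.url}")
    if r.status_code >= 400:
        print(f"{r.status_code}  {url}")
    # Note: soft 404s return 200, so pair status checks with content checks.
```
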
What can go wrong: Agencies sometimes skip the audit entirely or run a generic tool that misses critical issues like soft 404s (pages that return a 200 status but contain no meaningful content) or misconfigured canonical tags that point to the wrong URL. Without a baseline, you’re optimizing blind.

2. Crawl Budget: Why It Matters and How to Brief It

Crawl budget is the number of pages Googlebot will crawl on your site within a given timeframe. It’s allocated based on your site’s authority, update frequency, and server response times. For large sites (10,000+ pages), mismanaging crawl budget means Google might never discover your best content.

How to brief your agency:

  • Ask them to analyze your current crawl stats in Google Search Console: how many pages are crawled per day, and what’s the average server response time?
  • Request a prioritized crawl strategy: which sections of your site should be crawled first? This usually means your highest-traffic pages, newly published content, and pages with recent updates.
  • Ensure they review your `robots.txt` file. A common mistake is accidentally blocking important pages (e.g., `/blog/` or `/products/`) while allowing endless session-parameter URLs that waste crawl budget (see the example after this list).
  • Ask for a sitemap.xml submission strategy. The sitemap should include only canonical, indexable pages—not filtered or paginated URLs.
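
A sensible baseline looks something like this (an illustrative sketch only; the paths and domain are placeholders you would replace with your own):

```
# Illustrative robots.txt: all paths and the domain are placeholders
User-agent: *
# Keep non-public and duplicate areas out of the crawl
Disallow: /admin/
Disallow: /staging/
# Block session/filter parameter URLs that waste crawl budget
Disallow: /*?sessionid=
Disallow: /*?sort=

Sitemap: https://example.com/sitemap.xml
```
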
Risk alert: If your agency pushes an aggressive crawl rate on a shared hosting environment, it can slow down your server and trigger crawl errors in Google Search Console. Conversely, if crawling is throttled too far, new content may take weeks to appear in search results. Monitor your server logs to confirm actual crawl behavior.
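
One way to do that is to tally Googlebot requests per URL path from your access logs. A minimal sketch, assuming an Apache/Nginx combined log format (the file name is a placeholder; user-agent strings can be spoofed, so verify important findings against Googlebot’s published IP ranges):

```python
import re
from collections import Counter

# Matches the request and user-agent fields of a combined-format log line.
# Adjust the regex if your log format differs.
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

hits = Counter()
with open("access.log") as f:  # placeholder path to your server log
    for line in f:
        m = LOG_LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1

# The 20 paths Googlebot requests most often
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```
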

3. Core Web Vitals: The User Experience Gatekeeper

Core Web Vitals are a set of real-world metrics that measure loading performance (LCP), interactivity (INP, which replaced FID in March 2024), and visual stability (CLS). Since Google’s page experience update, these metrics directly influence rankings—especially for competitive queries.

What your agency must deliver:

  • A baseline measurement of LCP, CLS, and INP using field data from the Chrome User Experience Report (CrUX). Lab data from Lighthouse is useful for debugging but not for ranking assessment.
  • A prioritized remediation plan. For example:
      • LCP > 2.5 seconds: Optimize the largest contentful element (often a hero image or video). Solutions include preloading the hero asset, image compression, CDN delivery, or server-side rendering. (Avoid lazy loading the LCP element itself; that delays it further.)
      • CLS > 0.1: Identify layout shift sources (ads, dynamically injected content, web fonts with FOUT). Fix by reserving space for ads, using `font-display: swap`, and avoiding layout shifts on user interaction (see the markup snippet after this list).
      • INP > 200ms: Reduce JavaScript execution time, break up long tasks, and defer non-critical scripts.
  • A monitoring cadence: they should track vitals weekly and alert you to regressions.
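
For the CLS items above, the fixes are mostly markup and CSS. An illustrative sketch (file names and class names are placeholders):

```html
<!-- Give images explicit dimensions so the browser reserves space before load -->
<img src="/images/hero.jpg" width="1200" height="600" alt="Product hero">

<!-- Reserve a fixed slot for ads so late-loading creatives don't push content -->
<div class="ad-slot" style="min-height: 250px;"></div>

<style>
  /* Show fallback text immediately, then swap in the web font
     (avoids both invisible text and large swap shifts) */
  @font-face {
    font-family: "BrandFont";
    src: url("/fonts/brand.woff2") format("woff2");
    font-display: swap;
  }
</style>
```
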
What can go wrong: Agencies sometimes focus on lab-based optimizations (e.g., minifying CSS) that don’t improve field data. Worse, they might implement aggressive lazy loading that breaks user experience or remove web fonts entirely, harming brand consistency. Always test changes on a staging environment before pushing live.
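
You can verify field data yourself rather than taking lab reports at face value: the Chrome UX Report API returns the same 28-day field metrics Google uses. A minimal sketch, assuming you have a Google API key (the key and origin below are placeholders):

```python
import requests

# Minimal sketch: pull 28-day p75 field data from the Chrome UX Report API.
# The API key and origin are placeholders; get a key from the Google Cloud console.
API_KEY = "YOUR_CRUX_API_KEY"
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(
    ENDPOINT,
    json={
        "origin": "https://example.com",
        "metrics": [
            "largest_contentful_paint",
            "cumulative_layout_shift",
            "interaction_to_next_paint",
        ],
    },
    timeout=10,
)
resp.raise_for_status()

# p75 is the value Google uses to assess each Core Web Vital
for name, metric in resp.json()["record"]["metrics"].items():
    print(f"{name}: p75 = {metric['percentiles']['p75']}")
```
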

4. XML Sitemaps and robots.txt: The Gateway to Indexation

Your sitemap.xml and robots.txt file are the first things Googlebot reads when it visits your site. If either is misconfigured, your entire SEO strategy collapses.

Briefing checklist:

  • Sitemap.xml: Should contain only canonical URLs that you want indexed. Exclude paginated pages, parameter URLs, and thin content. Include a `<lastmod>` tag to signal freshness. Submit the sitemap via Google Search Console (a minimal example follows this list).
  • robots.txt: Use `Disallow` directives sparingly. Block only non-public areas (admin panels, staging sites, duplicate content directories). Do not block CSS, JavaScript, or image files unless you have a specific reason—blocking them can prevent Google from rendering your pages correctly.
  • Crawl-delay: If your site is on a slow server, you can set a `crawl-delay` directive, but note that Googlebot ignores it (Bing and Yandex honor it); it’s usually better to improve server response time instead.
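
Here is what a clean, minimal sitemap entry looks like (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only canonical, indexable URLs belong here -->
  <url>
    <loc>https://example.com/products/red-shoes/</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/how-to-choose-running-shoes/</loc>
    <lastmod>2024-04-18</lastmod>
  </url>
</urlset>
```
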
Common mistakes:
  • Having multiple sitemaps with overlapping URLs.
  • Using `noindex` directives on pages that are in the sitemap (creates a conflict).
  • Blocking Googlebot via robots.txt while expecting it to crawl your site.

5. Canonical Tags and Duplicate Content: The Silent Traffic Leak

Duplicate content isn’t a penalty—it’s a dilution problem. When Google sees multiple URLs with identical or very similar content, it splits ranking signals across them, weakening your authority. The canonical tag tells Google which version is the primary one.

How to brief your agency:

  • Conduct a duplicate content audit: use a tool like Siteliner or Copyscape to find pages with >80% similarity.
  • For each duplicate group, the agency must identify the canonical URL. This should be the most authoritative or most-linked page.
  • Ensure the canonical tag is self-referencing on the chosen URL (e.g., `https://example.com/page/` points to itself); see the markup example after this list.
  • Check for common canonical errors:
      • Canonical pointing to a non-indexable page (blocked by robots.txt or noindex).
      • Canonical pointing to a 404 page.
      • Missing canonical tags on paginated pages. Google no longer uses `rel="next"`/`rel="prev"` as an indexing signal, so give each paginated page a self-referencing canonical instead.
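
The markup itself is a single `<link>` element in the `<head>` of every variant, all pointing at the clean URL (the URLs here are placeholders matching the scenario described next):

```html
<!-- Placed on https://example.com/product/red-shoes/ AND on
     https://example.com/product/red-shoes/?color=red -->
<head>
  <link rel="canonical" href="https://example.com/product/red-shoes/">
</head>
```
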
Risk scenario: Imagine your e-commerce site has product pages accessible via `/product/red-shoes/` and `/product/red-shoes?color=red`. Without a canonical tag, Google might index both, splitting link equity. Worse, if the agency accidentally sets the canonical to a non-canonical URL (e.g., a session-based URL), the page may not rank at all.

6. On-Page Optimization and Keyword Research: Beyond Meta Tags

On-page optimization has evolved from stuffing keywords into meta descriptions to aligning content with search intent. Your agency must demonstrate they understand the difference between informational, navigational, commercial, and transactional queries.

What to include in the brief:

  • Keyword research: Request a list of target keywords with search volume, difficulty, and intent classification. Avoid broad, high-volume keywords that are impossible to rank for unless your site has massive authority.
  • Intent mapping: Each page should target a specific intent. A blog post about “how to fix a leaky faucet” (informational) should not have a transactional call-to-action like “buy our plumbing service now.” Instead, it should link to a service page.
  • Content strategy: Ask for a content calendar that fills gaps in your existing coverage. Use tools like Ahrefs or SEMrush to find topics your competitors rank for but you don’t.
  • On-page elements: Title tags (under 60 characters), meta descriptions (under 160 characters), header tags (H1, H2, H3 with natural keyword placement), image alt text, and internal linking structure (see the markup sketch after this list).
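
In markup terms, the core on-page elements look like this (the titles, descriptions, and paths are illustrative only):

```html
<head>
  <!-- Title: under 60 characters, primary keyword near the front -->
  <title>How to Fix a Leaky Faucet: A Step-by-Step Guide</title>
  <!-- Meta description: under 160 characters, written for clicks, not keyword lists -->
  <meta name="description" content="Learn how to fix a leaky faucet in 20 minutes with basic tools. Covers compression, cartridge, and ball faucets.">
</head>
<body>
  <h1>How to Fix a Leaky Faucet</h1>
  <h2>Step 1: Shut Off the Water Supply</h2>
  <img src="/images/faucet-washer.jpg" alt="Replacing a worn washer in a compression faucet">
  <!-- Internal link to a related service page, matching the informational intent -->
  <p>If the leak persists, see our <a href="/services/plumbing-repair/">plumbing repair service</a>.</p>
</body>
```
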
What can go wrong: Agencies sometimes over-optimize: stuffing exact-match keywords into every paragraph, creating unnatural anchor text, or writing meta descriptions that are just keyword lists. This can trigger algorithmic demotions (Google’s Panda update targeted thin, low-quality content; Penguin targeted manipulative links). Also, beware of “keyword cannibalization”—multiple pages targeting the same keyword with slight variations, which confuses Google.

7. Link Building: Quality Over Quantity, Always

Link building is the riskiest component of SEO. One bad link from a spammy directory can trigger a manual penalty that takes months to recover from. Your agency must be transparent about their outreach methods.

How to brief your agency:

  • Define your link profile goals: Focus on earning links from authoritative, relevant sites in your industry. Tools like Moz’s Domain Authority or Majestic’s Trust Flow can help assess link quality, but no single metric is definitive.
  • Request a backlink audit: Identify toxic links (from PBNs, link farms, irrelevant directories) and disavow them via Google’s Disavow Tool. The agency should provide a list of disavowed domains (see the file-format example after this list).
  • Outreach guidelines: Require that guest posts, resource links, and digital PR placements appear on sites with real traffic and editorial oversight. Avoid automated link exchange programs or paid links that violate Google’s spam policies (formerly the Webmaster Guidelines).
  • Anchor text diversity: Natural link profiles mix branded anchors, naked URLs, and generic anchors (e.g., “click here”). Over-optimized exact-match anchors (e.g., “best SEO agency”) are a red flag.
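
Google’s disavow file is a plain-text list, one entry per line; `domain:` entries cover every URL on that domain. An illustrative sketch (all domains are placeholders):

```
# Disavow file: uploaded via Google Search Console's Disavow Tool
# Disavow an entire spammy domain
domain:spammy-directory.example
domain:pbn-network.example
# Disavow a single URL
http://link-farm.example/paid-links-page.html
```
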
What can go wrong: Black-hat link building—buying links from PBNs, using automated tools to create forum profiles, or participating in link schemes—can result in a Google manual action. Recovery requires a detailed reconsideration request and removal of all unnatural links. Some agencies promise “guaranteed first-page rankings” through black-hat tactics; this is a clear sign to walk away.

8. Monitoring, Reporting, and Accountability

Finally, your brief must specify how success is measured and reported. SEO is a long-term game, but you should see tangible progress within 3–6 months.

Reporting checklist:

  • Monthly reports that include:
      • Organic traffic (by landing page and keyword group); see the API sketch after this list
      • Indexation status (pages indexed vs. submitted)
      • Core Web Vitals trends
      • Crawl errors and fixes
      • Backlink profile changes (new links gained, toxic links disavowed)
  • Goal alignment: Traffic is vanity without conversion. Ask for conversion tracking (form submissions, purchases, phone calls) tied to organic sessions.
  • Accountability: If the agency misses a deadline or fails to address a critical issue (e.g., a 50% drop in indexed pages), what’s the escalation process?
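
To keep the agency honest on the traffic numbers, you can pull the same data yourself from the Search Console API. A minimal sketch using the google-api-python-client library, assuming you have already completed the OAuth setup (the token file, site URL, and dates are placeholders):

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Placeholder: load OAuth credentials however your stack provides them
creds = Credentials.from_authorized_user_file("gsc-token.json")
service = build("searchconsole", "v1", credentials=creds)

# Organic clicks and impressions by landing page for one month
response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # must match a verified GSC property
    body={
        "startDate": "2024-04-01",
        "endDate": "2024-04-30",
        "dimensions": ["page"],
        "rowLimit": 100,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```
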
Red flags to watch for:
  • Reports that only show “increased traffic” without context (e.g., from what baseline?).
  • Excuses for poor performance that blame Google updates without data.
  • Refusal to share raw data or tool access.

Summary Checklist for Your Agency Brief

| Component | What to Request | Risk to Avoid |
| --- | --- | --- |
| Technical SEO audit | Full crawl report, Core Web Vitals field data, crawl budget analysis | Skipping audit, using only lab data |
| Crawl budget | Crawl stats from GSC, prioritized crawl strategy, robots.txt review | Blocking important pages, over-aggressive crawl |
| Core Web Vitals | Baseline field data, remediation plan, weekly monitoring | Lab-only optimizations, breaking UX |
| Sitemap & robots.txt | Clean sitemap with canonical URLs, minimal robots.txt blocks | Conflicting directives, blocked resources |
| Canonical tags | Duplicate content audit, self-referencing canonicals | Canonical to non-indexable pages |
| On-page optimization | Intent-mapped keywords, content strategy, proper meta tags | Over-optimization, keyword cannibalization |
| Link building | Backlink audit, toxic disavowal, quality outreach guidelines | Black-hat links, manual penalties |
| Reporting | Monthly traffic, indexation, vitals, backlink changes | Vanity metrics, no conversion tracking |

Final thought: A great SEO agency will push back on your brief if they think parts of it are unnecessary or harmful. That’s a good sign. But they should also welcome transparency and data-sharing. If they’re defensive about sharing raw data or dismissive of technical audits, find another partner. Your site’s health—and your organic revenue—depends on it.

For more on how to evaluate agency performance, read our guide on Technical SEO and Site Health. And if you’re building an in-house SEO team, start with Expert Technical SEO Services for Site Health & Performance Optimization.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
