The Technical SEO & Site Health Checklist: What Every Agency Brief Must Cover

You’ve decided to hire an SEO agency, or maybe you’re the one building the internal brief. Either way, the conversation usually starts with rankings—but it should start with crawlability. Before any keyword strategy or content plan can deliver, Googlebot needs to actually access, parse, and index your pages. That’s where technical SEO and site health optimization enter the picture. This checklist is designed to help you brief an agency (or evaluate their proposal) on the technical fundamentals, without falling for promises that violate search engine guidelines.

Why Technical SEO Comes First

Search engines operate like librarians: they send out crawlers (Googlebot, Bingbot) to discover and read every page on your site. If those crawlers hit a 404, a redirect chain, or a page that takes eight seconds to load, they won’t index it properly—or worse, they’ll drop it from the index entirely. This is the domain of technical SEO: ensuring that the infrastructure of your website supports discovery, rendering, and ranking.

A proper technical SEO audit examines crawl budget allocation, server response codes, XML sitemap structure, robots.txt directives, canonical tags, duplicate content signals, and Core Web Vitals. Each of these elements can either accelerate or block organic growth. For example, a misconfigured robots.txt file that accidentally disallows your blog section can kill months of content investment. An agency that skips the technical audit and jumps straight to link building is building a house on sand.
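
You don’t have to wait for the audit to get a feel for this. A minimal sketch in Python (using the requests library; the URLs are placeholders for your own pages) fetches a handful of URLs and prints each status-code chain, which quickly surfaces 404s and multi-hop redirects:

```python
# Minimal sketch: spot-check status codes and redirect chains for a few URLs.
# The URL list is illustrative; swap in pages from your own sitemap.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/old-product-page/",
]

for url in urls:
    # allow_redirects=True follows the chain so we can inspect every hop
    response = requests.get(url, allow_redirects=True, timeout=10)
    chain = [r.status_code for r in response.history] + [response.status_code]
    hops = len(response.history)
    print(f"{url} -> {chain} ({hops} redirect hop(s))")
    if hops > 1:
        print("  Warning: redirect chain; point internal links at the final URL.")
```

Anything longer than a single 301 hop is worth flagging in the brief.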

The Crawl Budget Reality Check

Crawl budget refers to the number of URLs Googlebot will crawl on your site within a given timeframe. Large sites (10,000+ pages) often struggle with crawl waste—Googlebot spends time on thin, duplicate, or low-value pages while ignoring your money pages. Smaller sites rarely hit crawl limits, but they can still suffer from poor crawl efficiency if internal linking is broken or sitemaps are outdated.

When briefing an agency, ask how they plan to optimize crawl budget. A credible response will include:

  • Identifying and consolidating low-value pages (thin content, parameter-heavy URLs, archived duplicates)
  • Improving internal linking structure to guide crawlers toward priority pages
  • Updating the XML sitemap to reflect only canonical, indexable pages
  • Using robots.txt to block crawlers from admin sections, staging environments, and search result pages

If the agency says “we don’t worry about crawl budget, it’s automatic,” that’s a red flag. Crawl budget management is not about tricking Google into crawling more—it’s about helping Google crawl the right pages.
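
One way to make “the right pages” measurable is a server-log review: count which URLs Googlebot actually requests and compare that list against your priority pages. A rough sketch, assuming a combined-format access log (the file path and regex are placeholders, and matching the user-agent string alone won’t filter out spoofed bots):

```python
# Rough sketch: tally which paths Googlebot requests most often, from an
# access log in combined format. The log path and regex are assumptions;
# adjust both to your server's configuration.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path
line_pattern = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*"')

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = line_pattern.search(line)
        if match:
            hits[match.group("path")] += 1

# The most-crawled paths should be priority pages, not parameter-heavy
# filters or archived duplicates.
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```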

Core Web Vitals: More Than a Speed Score

Core Web Vitals (LCP, CLS, and INP, which replaced FID in March 2024) are now ranking signals, but they’re also user experience metrics. An agency that only runs a PageSpeed Insights test and calls it done is underselling the work. Real optimization requires:

| Metric | Target | Common Issues | Optimization Approach |
| --- | --- | --- | --- |
| Largest Contentful Paint (LCP) | ≤ 2.5 seconds | Slow server response, render-blocking resources, large images | Server-side caching, image compression, CDN, lazy loading of non-critical assets |
| Cumulative Layout Shift (CLS) | ≤ 0.1 | Ads without reserved space, dynamic content injection, web fonts causing reflow | Set explicit dimensions for media, reserve ad slots, use font-display: swap |
| Interaction to Next Paint (INP) | ≤ 200 ms | Heavy JavaScript execution, long tasks, third-party scripts | Code splitting, defer non-critical JS, audit third-party widgets |

A competent technical SEO audit will include a lab-based and field-based assessment of these metrics. Field data comes from Chrome User Experience Report (CrUX), which reflects real user experiences—not just your test environment. If the agency only shows you lab results, ask for the field data.
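
That field data is queryable directly. The sketch below asks the Chrome UX Report API for an origin’s 75th-percentile values, the numbers Google assesses against the targets above (the API key and origin are placeholders; the key requires the Chrome UX Report API to be enabled in a Google Cloud project, and PHONE is just one form factor):

```python
# Sketch: pull field (CrUX) p75 values for an origin via the CrUX API.
# CRUX_API_KEY and the origin are placeholders.
import requests

CRUX_API_KEY = "YOUR_API_KEY"
ENDPOINT = (
    "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
    f"?key={CRUX_API_KEY}"
)

payload = {
    "origin": "https://www.example.com",
    "formFactor": "PHONE",
    "metrics": [
        "largest_contentful_paint",
        "cumulative_layout_shift",
        "interaction_to_next_paint",
    ],
}

response = requests.post(ENDPOINT, json=payload, timeout=10)
response.raise_for_status()
metrics = response.json()["record"]["metrics"]

# p75 is the value Google uses when assessing Core Web Vitals.
for name, data in metrics.items():
    print(f"{name}: p75 = {data['percentiles']['p75']}")
```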

On-Page Optimization: Beyond Meta Tags

On-page optimization is often misunderstood as simply writing title tags and meta descriptions. In reality, it’s a structural discipline that includes header hierarchy (H1–H6), semantic HTML5 elements, internal linking anchor text, and schema markup. When briefing an agency, request a page-level audit that maps each page’s target keyword to its content, heading structure, and supporting internal links.

A thorough on-page audit should also check for the issues below (a scripted spot-check follows the list):

  • Duplicate title tags and meta descriptions across similar pages
  • Missing or broken H1 tags (each page needs exactly one)
  • Thin content (pages with fewer than 300 words that don’t serve a clear user need)
  • Missing alt text on images, which affects image search and accessibility
  • Canonical tag misconfigurations that send conflicting signals about which URL is the primary version
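
Several of these checks are easy to script before the full audit. A minimal sketch using requests and BeautifulSoup (the URLs are placeholders; a real crawl would pull them from your sitemap):

```python
# Sketch: spot-check a few pages for duplicate titles, H1 count, and missing
# alt text. URLs are placeholders; requests and beautifulsoup4 are required.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/sample-post/",
]

titles = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    titles[title].append(url)

    h1_count = len(soup.find_all("h1"))
    if h1_count != 1:
        print(f"{url}: expected exactly one H1, found {h1_count}")

    missing_alt = [img for img in soup.find_all("img") if not img.get("alt")]
    if missing_alt:
        print(f"{url}: {len(missing_alt)} image(s) missing alt text")

for title, pages in titles.items():
    if len(pages) > 1:
        print(f"Duplicate title '{title}' on: {', '.join(pages)}")
```
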
Intent mapping is the next layer: does the page match what the searcher actually wants? A page optimized for “buy running shoes” that serves a blog post about “how to choose running shoes” will fail regardless of technical perfection. The agency should demonstrate how they map keywords to informational, navigational, commercial, or transactional intent.

Link Building with Risk Awareness

Link building remains one of the most effective ranking levers—and one of the most dangerous if done wrong. Black-hat tactics like private blog networks (PBNs), paid links, or automated outreach to low-quality directories can trigger manual penalties or algorithmic demotions. A risk-aware agency will:

  • Audit your existing backlink profile using tools like Ahrefs, Majestic, or Semrush to identify toxic links
  • Disavow harmful links through Google Search Console, but only after careful analysis (disavowing good links can hurt your profile)
  • Focus on earned links through content partnerships, digital PR, and guest posting on relevant, authoritative sites
  • Monitor Trust Flow and Domain Authority as directional metrics, not guarantees—these are third-party scores, not Google metrics

When briefing a link building campaign, ask for the agency’s outreach methodology. Do they target sites with real editorial standards? Do they have a process for verifying that a potential link partner isn’t part of a link exchange scheme? Avoid any agency that promises a specific number of links per month without disclosing the quality thresholds.
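
On the disavow point, the file Google accepts is plain text: one domain: entry or full URL per line, uploaded through Search Console’s disavow tool. A minimal sketch, assuming a backlink export with domain and spam_score columns (the column names and threshold are illustrative, and no domain should reach the file without manual review):

```python
# Sketch: turn flagged domains from a backlink export into a disavow file.
# The CSV and its column names (domain, spam_score) are assumptions; every
# tool labels these differently, and entries should be reviewed by hand.
import csv

SPAM_SCORE_THRESHOLD = 60  # illustrative cutoff, not an industry standard

with open("backlink_export.csv", newline="", encoding="utf-8") as src, \
        open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Disavow file generated from manual backlink review\n")
    for row in csv.DictReader(src):
        if int(row["spam_score"]) >= SPAM_SCORE_THRESHOLD:
            # Google's disavow format accepts whole domains with this prefix
            out.write(f"domain:{row['domain']}\n")
```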

The XML Sitemap and Robots.txt Check

These two files are the gatekeepers of your site’s crawlability. A misconfigured robots.txt can block entire sections; an outdated XML sitemap can waste crawl budget. Here’s what a proper audit should cover:

Robots.txt best practices (a sample file follows the list):

  • Allow access to CSS, JS, and image files (blocking these can break rendering)
  • Disallow access to admin panels, staging environments, and thank-you pages
  • Test the file using the robots.txt report in Search Console (the standalone robots.txt Tester tool has been retired)
  • Avoid using “Disallow: /” unless you intentionally want to block all crawlers
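
Put together, an illustrative robots.txt that follows those practices might look like this (the paths are placeholders; your CMS determines the real ones):

```
# Illustrative robots.txt; all paths are placeholders
User-agent: *
Disallow: /wp-admin/
Disallow: /staging/
Disallow: /thank-you/
Disallow: /search/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```
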
XML sitemap best practices (a sample entry follows the list):
  • Include only canonical, indexable pages (no paginated archive pages, no filter URLs)
  • Keep file size under 50 MB (uncompressed) and under 50,000 URLs
  • Update the sitemap whenever you publish or remove significant content
  • Submit the sitemap in Google Search Console and Bing Webmaster Tools
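
And a minimal sitemap entry, for reference (URLs and dates are placeholders; most CMS platforms and SEO plugins generate this file automatically):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sitemap: URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-04-18</lastmod>
  </url>
</urlset>
```
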
If the agency doesn’t mention these files in their initial audit, ask why. They are foundational.

Red Flags in Agency Briefs

Not all agencies operate with the same ethical standards. Here are warning signs to watch for:

| Red Flag | Why It’s Dangerous | What to Ask Instead |
| --- | --- | --- |
| “We guarantee first-page rankings” | No one controls Google’s algorithm; guarantees are impossible | “What’s your track record for improving organic visibility across similar industries?” |
| “We use a secret link-building method” | Likely black-hat tactics that can lead to penalties | “Can you share examples of sites you’ve built links from, and your outreach process?” |
| “We don’t need access to your server” | Technical SEO requires server-level access for proper diagnosis | “How do you plan to check server response codes, redirects, and Core Web Vitals?” |
| “Core Web Vitals are a one-time fix” | They require ongoing monitoring as site content and third-party scripts change | “What’s your process for monitoring and maintaining Core Web Vitals post-optimization?” |

Final Checklist for Your Agency Brief

Before signing a contract, ensure the agency’s proposal includes these technical deliverables:

  1. Full technical SEO audit covering crawlability, indexation, redirects, broken links, and server log analysis
  2. Core Web Vitals assessment using both lab and field data, with a remediation plan
  3. XML sitemap and robots.txt audit with specific recommendations
  4. On-page optimization per page, including intent mapping and schema markup
  5. Backlink profile analysis with toxic link identification and disavow strategy
  6. Ongoing monitoring of crawl stats, index coverage, and Core Web Vitals in Search Console
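
Much of item 6 can be scripted rather than checked by hand. The sketch below pulls page-level clicks and impressions from the Search Console API (the service-account file and property URL are placeholders, and the service account’s email must be added as a user on the property):

```python
# Sketch: pull page-level performance from the Search Console API.
# service-account.json and the property URL are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

report = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-04-01",
        "endDate": "2024-04-30",
        "dimensions": ["page"],
        "rowLimit": 25,
    },
).execute()

for row in report.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```
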
Technical SEO is not glamorous, but it’s the foundation that every other SEO effort depends on. A brief that skips these steps is a brief that’s setting itself up for frustration. Start with the infrastructure, and the rankings will follow—provided you have the patience to let the work compound over time.

For more on how to evaluate an agency’s approach to site health optimization, check our guide on running your own pre-audit. And if you’re building content strategy alongside technical fixes, our on-page optimization checklist covers the editorial side of the equation.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
