The Technical SEO Checklist: What a Top-Tier Agency Actually Checks (And What They Don't Tell You)

You've hired an SEO agency, or you're about to. The pitch sounded polished—"comprehensive technical audit," "on-page optimization," "site performance." But the reality is that many agencies treat technical SEO as a one-time fix, not an ongoing governance process. They run a crawler, flag a few 404s, call it a day, and move on to link building. If you want results that stick, you need a checklist that separates a genuine technical partner from a report-factory.

This guide walks through exactly what a top-tier agency should be auditing, why each element matters, and where most agencies cut corners. We'll cover crawl budget management, Core Web Vitals, content duplication, and the link-profile risks that make or break a campaign. No fluff, no guaranteed outcomes—just the operational reality of site health.

1. Crawl Budget & Server Response: The Foundation Nobody Talks About

Before any keyword research or content strategy begins, your agency must understand how search engines interact with your server. This isn't just about having a `robots.txt` file—it's about managing crawl allocation so Googlebot spends its time on pages that matter.

A common failure point: agencies block important resources via `robots.txt` without realizing it. For example, if your CSS or JavaScript files are disallowed, Google may not render your pages correctly, and what looks like an LCP problem in your Core Web Vitals reporting may actually be a rendering misconfiguration. Worse, they might leave staging or development subdomains open to indexing, creating duplicate content that competes with your live pages.

What a proper audit looks like:

| Crawl Factor | What the Agency Checks | Common Mistake |
| --- | --- | --- |
| Crawl rate | Server response time (TTFB), crawl stats in GSC | Ignoring rate limiting; not adjusting for server load |
| URL parameters | Parameter handling for session IDs, filters, pagination | Allowing infinite crawl paths (e.g., /?page=1&sort=price&color=blue) |
| Orphaned pages | Pages with no internal links | Only checking linked pages; missing content islands |
| XML sitemap | Inclusion of canonical URLs, lastmod accuracy | Submitting sitemaps with non-indexable URLs or future dates |
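
The sitemap row above is easy to spot-check programmatically. Here is a minimal sketch that flags future `lastmod` dates and parameterized URLs in a standard sitemap.xml; the sitemap URL is a placeholder, and what counts as "non-indexable" on your site will need your own rules:

```python
# Minimal sketch: flag sitemap entries with future lastmod dates or
# parameterized URLs. The sitemap URL below is a placeholder.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from urllib.request import urlopen

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen("https://example.com/sitemap.xml") as resp:
    tree = ET.parse(resp)

now = datetime.now(timezone.utc)
for url in tree.iterfind("sm:url", NS):
    loc = url.findtext("sm:loc", default="", namespaces=NS)
    lastmod = url.findtext("sm:lastmod", default="", namespaces=NS)
    if lastmod:
        try:
            # fromisoformat parses date-only values on any modern Python;
            # full timestamps with "Z" need Python 3.11+
            when = datetime.fromisoformat(lastmod)
        except ValueError:
            print(f"Unparseable lastmod on {loc}: {lastmod}")
            continue
        if when.tzinfo is None:
            when = when.replace(tzinfo=timezone.utc)
        if when > now:
            print(f"Future lastmod on {loc}: {lastmod}")
    if "?" in loc:
        print(f"Parameterized URL in sitemap: {loc}")
```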

A top-tier agency will also check your `robots.txt` for directives that accidentally block Googlebot from accessing critical resources. They should show you the results in Search Console's robots.txt report (the successor to the retired standalone Robots Testing Tool), not just hand you a PDF.
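
For a quick first pass you can also test the file yourself from the standard library. A minimal sketch with placeholder URLs; note that `urllib.robotparser` implements the original robots.txt spec and doesn't reproduce every nuance of Googlebot's matching, so treat it as a sanity check rather than a verdict:

```python
# Minimal sketch: verify that rendering-critical paths are crawlable.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")  # placeholder domain
rp.read()

# Paths your templates actually load; adjust to your own site.
critical_paths = [
    "https://example.com/assets/main.css",
    "https://example.com/assets/app.js",
    "https://example.com/products/",
]
for path in critical_paths:
    allowed = rp.can_fetch("Googlebot", path)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {path}")
```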

Action step: Ask your agency for a crawl budget analysis that includes server logs or Google Search Console data. If they can't produce it, they're guessing.
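
If you want to sanity-check their numbers yourself, a rough cut of Googlebot activity can be pulled straight from raw access logs. A minimal sketch, assuming common/combined log format and a placeholder filename; a real audit should verify Googlebot hits by reverse DNS, since the user-agent string is trivially spoofed:

```python
# Minimal sketch: count Googlebot hits per path from an access log.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*"')

hits = Counter()
with open("access.log") as f:  # placeholder filename
    for line in f:
        if "Googlebot" not in line:
            continue
        m = LINE.search(line)
        if m:
            hits[m.group("path")] += 1

# The paths eating the most crawl budget appear first.
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```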

2. Duplicate Content & Canonicalization: The Silent Performance Killer

Duplicate content isn't just about having the same paragraph on two pages. It's about how your CMS generates URLs, how parameters create near-identical pages, and how pagination interacts with indexation. A proper audit will map every URL variant and decide which one should be canonical.

Consider an e-commerce site: /product/blue-widget, /product/blue-widget?color=blue, /product/blue-widget?size=large, and /product/blue-widget?color=blue&size=large. Without proper `rel="canonical"` tags, Google may see four separate pages competing for the same keyword. The result? None of them rank well because authority is split.
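
One way to reason about this is to define, per template, which parameters actually change content and which only filter or sort it. A minimal sketch of that normalization, using the parameter names from the example above; which parameters are safe to drop is entirely site-specific:

```python
# Minimal sketch: collapse parameter variants onto one canonical URL by
# dropping parameters that only filter presentation.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

DROPPABLE = {"color", "size", "sort", "sessionid"}  # site-specific

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in DROPPABLE]
    return urlunsplit(parts._replace(query=urlencode(kept)))

for variant in [
    "https://example.com/product/blue-widget",
    "https://example.com/product/blue-widget?color=blue&size=large",
]:
    print(canonical_url(variant))
# Both print https://example.com/product/blue-widget
```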

What the agency should be doing:

  • Run a full site crawl and identify all URLs that return 200 status.
  • Group URLs by content similarity using a tool like Screaming Frog or DeepCrawl.
  • Verify that every page has a self-referencing canonical tag (or a cross-domain canonical if syndicated content exists).
  • Check pagination: paginated URLs like /category/page/2 should generally carry a self-referencing canonical rather than point back to /category/. Canonicalizing an entire series to page one hides the deeper content, and Google retired rel="next"/"prev" as an indexing signal back in 2019.
A common practice to avoid is using canonicals to "steal" authority from other domains. Google treats cross-domain canonicals as strong signals, not directives, and misusing them can lead to penalties. An honest agency will tell you that canonical tags are suggestions, not commands.
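
To spot-check the self-referencing rule from the list above at scale, a short script can fetch each URL and compare it against its canonical tag. A minimal sketch, assuming the `requests` and `beautifulsoup4` packages and a placeholder URL:

```python
# Minimal sketch: check that a page's canonical tag is self-referencing.
import requests
from bs4 import BeautifulSoup

def check_canonical(url: str) -> None:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    if tag is None or not tag.get("href"):
        print(f"MISSING canonical: {url}")
    elif tag["href"].rstrip("/") != url.rstrip("/"):
        # Not necessarily wrong, but every cross-reference needs a reason.
        print(f"POINTS ELSEWHERE: {url} -> {tag['href']}")
    else:
        print(f"OK: {url}")

check_canonical("https://example.com/product/blue-widget")  # placeholder
```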

Risk callout: If your agency recommends blanket canonicalization to "solve duplicate content" without analyzing each case, push back. Applied indiscriminately, it can drop pages out of the index that should be earning traffic. Every canonical decision deserves a per-template analysis.

3. Core Web Vitals: Beyond the Lighthouse Score

Core Web Vitals (LCP, INP, and CLS; INP replaced FID as the responsiveness metric in March 2024) are not a checkbox. They are a continuous monitoring process. Many agencies run a single Lighthouse audit, get a green score, and declare victory. But Lighthouse tests in a controlled environment—your real users on mobile with poor connections tell a different story.

A top-tier agency will:

  • Pull real-user monitoring (RUM) data from CrUX (Chrome User Experience Report) to see actual field metrics.
  • Identify which page templates are failing (e.g., product pages vs. blog posts).
  • Diagnose the root cause: is LCP slow because of a hero image, a web font, or a third-party script?
  • Prioritize fixes by traffic impact. A slow page that gets 10 visits a month is less urgent than a slow page that gets 10,000.
What a proper audit table looks like:

| Metric | Field Data (CrUX) | Lab Data (Lighthouse) | Root Cause | Fix Priority |
| --- | --- | --- | --- | --- |
| LCP | 3.2s (needs improvement) | 2.1s (good) | Unoptimized hero image, render-blocking font | High |
| INP | 280ms (needs improvement) | 180ms (good) | Third-party chat widget loading on all pages | High |
| CLS | 0.15 (needs improvement) | 0.05 (good) | Dynamic ad slots without dimensions | Medium |

If your agency only provides lab data without field data, they are missing half the picture. Ask for a CrUX report.
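
Field data doesn't require a vendor dashboard; the CrUX API exposes origin-level p75 values directly. A minimal sketch, assuming the public `queryRecord` endpoint and metric names as documented at the time of writing; the API key is a placeholder you'd obtain from Google Cloud:

```python
# Minimal sketch: pull p75 field metrics for an origin from the CrUX API.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

body = {
    "origin": "https://example.com",  # placeholder origin
    "formFactor": "PHONE",
    "metrics": [
        "largest_contentful_paint",
        "interaction_to_next_paint",
        "cumulative_layout_shift",
    ],
}
resp = requests.post(ENDPOINT, json=body, timeout=10)
resp.raise_for_status()

metrics = resp.json()["record"]["metrics"]
for name, data in metrics.items():
    print(f"{name}: p75 = {data['percentiles']['p75']}")
```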

Common mistake: Agencies recommend lazy-loading everything to fix CLS. That can backfire on LCP: lazy-loading the hero image delays the largest paint, and a render-blocking lazy-load script makes it worse. A good agency will test each change in a staging environment before pushing to production.

4. On-Page Optimization: Intent Mapping, Not Keyword Stuffing

On-page optimization has evolved beyond stuffing "best SEO agency" into every H2. The modern approach is intent mapping: understanding what the user wants at each stage of their journey and structuring content accordingly.

A proper on-page audit covers:

  • Title tags and meta descriptions: Are they unique? Do they match the search intent? Do they include the primary keyword naturally?
  • Heading structure: Is there a single H1 that describes the page topic? Are H2s used for subtopics? Are H3s nested correctly?
  • Content quality: Is the content comprehensive enough to satisfy the query? Does it include related questions, examples, or data?
  • Internal linking: Are you linking to relevant pages within your site? Are anchor texts descriptive but not over-optimized?
  • Schema markup: Is structured data present for the content type (e.g., Article, Product, FAQ)? Is it valid according to Google's Rich Results Test? (A quick extraction script is sketched after this list.)
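
On the schema point, structured data is easy to pull out of a page for review before it goes anywhere near a validator. A minimal sketch, assuming `requests` and `beautifulsoup4` and a placeholder URL:

```python
# Minimal sketch: extract and parse JSON-LD blocks from a page so they
# can be reviewed or fed to a validator.
import json
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/blog/post", timeout=10)  # placeholder
soup = BeautifulSoup(resp.text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError as err:
        print(f"Invalid JSON-LD: {err}")
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        print(item.get("@type", "(no @type)"))
```
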
The agency should not:
  • Recommend keyword stuffing in headings or body copy.
  • Suggest exact-match anchor text for every internal link.
  • Ignore user intent in favor of high-volume keywords.
A common agency shortcut is to run a tool like Surfer SEO or Clearscope, generate a content brief, and call it "on-page optimization." That's a starting point, not a strategy. A real audit involves manual review of search results, competitor analysis, and understanding your brand's unique angle.

Action step: Ask your agency for a sample page audit that includes intent analysis. If they can't explain why a page should target a specific keyword, they are guessing.

5. Link Building & Backlink Profile: The Risk-Reward Reality

Link building is a nuanced aspect of SEO. Some agencies promise "100 high-DA backlinks in 30 days." Others refuse to do any outreach, claiming it's all about content. The truth lies somewhere in between, but the risks are real.

A top-tier agency will:

  • Audit your existing backlink profile using tools like Ahrefs, Majestic, or Moz. They'll look for toxic links, spammy domains, and unnatural anchor text distribution.
  • Disavow harmful links if necessary, but only after careful analysis. Disavowing a link that is actually helping you may affect your rankings, so each case should be evaluated individually.
  • Build links through genuine outreach—guest posts, resource pages, broken link building, or digital PR. They should not buy links from PBNs (private blog networks) or use automated tools.
  • Track link quality using metrics like Trust Flow, Domain Rating, and relevance. A link from a high-DA but irrelevant site (e.g., a gambling site linking to a dental practice) may not be beneficial and could pose risks.
Risk callout: Black-hat link building is not safe. Google's algorithms (like Penguin) are good at detecting unnatural patterns. If your agency claims they can guarantee no penalties, that is unrealistic. Every link building strategy carries some level of risk; the goal is to minimize it through quality and relevance.
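
Anchor-text distribution is one of the easier of these patterns to check yourself. A minimal sketch, assuming a CSV export with a hypothetical `anchor` column header (each backlink tool names its columns differently) and an illustrative 10% concentration threshold:

```python
# Minimal sketch: flag over-concentrated anchor text in a backlink export.
# The "anchor" column name and the 10% threshold are illustrative.
import csv
from collections import Counter

anchors = Counter()
with open("backlinks.csv", newline="") as f:  # placeholder filename
    for row in csv.DictReader(f):
        anchors[row["anchor"].strip().lower()] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(10):
    share = count / total
    flag = "  <-- suspiciously concentrated" if share > 0.10 and anchor else ""
    print(f"{share:5.1%}  {anchor or '(empty)'}{flag}")
```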

What a backlink audit table looks like:

| Domain | Trust Flow | Domain Rating | Relevance | Action |
| --- | --- | --- | --- | --- |
| example-spam-site.com | 2 | 5 | Low (gambling) | Disavow |
| industry-blog.com | 25 | 40 | High (tech) | Keep |
| guest-post-network.org | 8 | 30 | Medium (general) | Investigate |

Red flag: If your agency asks for a budget for "link packages" or "guaranteed placements," walk away. Real link building is time-intensive and results vary.

6. Content Strategy & Keyword Research: The Intersection of Data and Creativity

Keyword research is not just about finding high-volume terms. It's about understanding what your audience actually searches for and creating content that answers those questions better than competitors.

A proper process involves:

  • Identifying seed keywords based on your products, services, and brand.
  • Expanding into long-tail variations using tools like Keyword Planner, SEMrush, or AnswerThePublic.
  • Analyzing search intent for each keyword: informational, navigational, commercial, or transactional.
  • Mapping keywords to existing pages and identifying gaps where new content is needed.
  • Creating a content calendar that prioritizes topics by search volume, competition, and business value.
The agency should avoid:
  • Targeting keywords that are too competitive without a realistic plan.
  • Creating content for the sake of content (e.g., "10 tips for better sleep" on a B2B software blog).
  • Ignoring existing content that could be optimized instead of creating new pages.
Action step: Ask your agency for a keyword-to-content map. This shows which keywords are assigned to which pages and why. If it's missing, they are flying blind.
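
There's no standard format for such a map; even a flat structure makes gaps visible. A minimal sketch with illustrative keywords, intents, and URLs (the real mapping comes from SERP and competitor analysis, not from the code):

```python
# Minimal sketch: a keyword-to-content map with intent labels.
# Keywords, intents, and URLs below are purely illustrative.
keyword_map = [
    {"keyword": "technical seo checklist", "intent": "informational",
     "page": "/blog/technical-seo-checklist", "status": "optimize existing"},
    {"keyword": "seo agency pricing", "intent": "commercial",
     "page": "/pricing", "status": "optimize existing"},
    {"keyword": "crawl budget analysis", "intent": "informational",
     "page": None, "status": "gap - new content needed"},
]

for row in keyword_map:
    target = row["page"] or "(no page yet)"
    print(f"{row['keyword']:30} {row['intent']:15} {target:35} {row['status']}")
```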

7. The Final Checklist: What to Expect from Your Agency

Before you sign a contract or renew a retainer, use this checklist to evaluate your agency's technical SEO approach:

  • They provided a crawl budget analysis with server log or GSC data.
  • They identified and fixed canonical tag issues across all page templates.
  • They ran Core Web Vitals analysis using field data (CrUX), not just Lighthouse.
  • They audited your backlink profile and disavowed toxic links (if needed).
  • They created an intent-based keyword map, not a list of high-volume terms.
  • They explained the risks of link building and have a documented outreach process.
  • They do not promise guaranteed rankings or instant results.
Final thought: Technical SEO is not a one-time project. It's a governance process that requires ongoing monitoring, testing, and adjustment. The best agencies treat it as a partnership, not a transaction. If yours is just running reports and sending invoices, it's time to ask harder questions.

For more on how to evaluate your site's health, check out our guide on technical SEO audits and Core Web Vitals optimization.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
