The Technical SEO & Site Health Checklist: What Your Agency Should Be Doing (And What to Watch For)

You’ve hired an SEO agency, or you’re about to. The promises sound good: “We’ll get you to the top of Google.” But the reality is that rankings are built on a foundation of technical health, not just keywords and backlinks. If your site’s crawl budget is wasted, your Core Web Vitals are poor, or you have duplicate content issues, no amount of content strategy will save you. This checklist walks you through the essential technical SEO services your agency should provide—and the risks to avoid.

1. The Technical SEO Audit: Your Starting Point

Every serious engagement begins with a technical SEO audit. This isn’t a one-time “we checked your site” report. It’s a deep, systematic analysis of how search engines interact with your website.

What a proper audit covers:

  • Crawlability: Can Googlebot access your important pages? Are you blocking critical resources via `robots.txt`?
  • Indexability: Are your pages being indexed correctly? Are there orphan pages (no internal links) or pages that should be noindexed?
  • Site structure: Is your URL hierarchy logical? Are you using breadcrumbs and internal linking effectively?
  • Duplicate content: Are there multiple versions of the same page (e.g., `www` vs. non-`www`, HTTP vs. HTTPS, trailing slash variations)? Your agency should implement canonical tags to signal the preferred URL.
  • Core Web Vitals: Are your LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), and INP (Interaction to Next Paint) within Google’s “good” thresholds? Poor performance here directly impacts rankings.

The risk: A lazy audit just runs a basic tool and lists errors without context. Your agency should explain why a problem exists and how to fix it. If they hand you a 50-page PDF with no prioritization, ask for a triage.
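The crawlability check above can be spot-tested with Python's standard library. This is a minimal sketch: the `robots.txt` rules below are made-up examples, and `example.com` stands in for your own domain.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; substitute your site's actual file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Spot-check whether Googlebot may fetch a few key URLs.
for path in ("/products/widget", "/admin/login", "/cart/checkout"):
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(path, "->", "crawlable" if allowed else "blocked")
```

Running this against your real `robots.txt` quickly reveals whether an important template or resource path is accidentally caught by a `Disallow` rule.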

2. Crawl Budget: Don’t Waste Google’s Time

Your site has a crawl budget—the number of pages Googlebot will crawl in a given period. If you have 10,000 pages but only 200 are important, you need to manage that budget.

What your agency should do:

  • Optimize your XML sitemap: Submit a clean, up-to-date `sitemap.xml` that lists only canonical, indexable pages. Remove redirects, 404s, and thin content.
  • Tune robots.txt: Ensure you’re not blocking critical CSS, JS, or image files (this can break rendering and hurt Core Web Vitals). Block low-value sections like admin panels or duplicate tag pages.
  • Fix redirect chains: A page that redirects three times before reaching the final URL wastes crawl budget. Your agency should audit and flatten these.
  • Monitor crawl stats in Google Search Console: Watch for spikes or drops in crawl rate that might indicate server issues or Googlebot being blocked.

The risk: Over-optimizing crawl budget (e.g., blocking too many pages) can accidentally hide important content. Your agency should use a phased approach: audit, prioritize, then implement.
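The redirect-chain fix above can be sketched in a few lines: given a source-to-target redirect map (in practice exported from a crawler), collapse every chain so each old URL points straight at its final destination. The URLs below are hypothetical.

```python
# Hypothetical redirect map exported from a site crawl.
redirects = {
    "/old-page": "/newer-page",
    "/newer-page": "/final-page",
    "/legacy": "/old-page",
}

def flatten(redirects):
    """Map every source URL directly to its final destination.

    Loops are flagged as None for manual review instead of spinning forever.
    """
    flat = {}
    for src in redirects:
        seen, target = {src}, redirects[src]
        while target in redirects:
            if target in seen:  # redirect loop detected
                target = None
                break
            seen.add(target)
            target = redirects[target]
        flat[src] = target
    return flat

# /legacy, /old-page, and /newer-page should all point straight at /final-page.
print(flatten(redirects))
```

With the flattened map in hand, the agency can rewrite the server's redirect rules so Googlebot never follows more than one hop.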

3. Core Web Vitals: The User Experience Metric That Matters

Google’s Core Web Vitals are now ranking signals. They measure real-world user experience: loading speed, visual stability, and interactivity. Your agency should be monitoring these via the Core Web Vitals report in Search Console and using tools like Lighthouse or PageSpeed Insights.

What a good agency does:

  • Diagnoses LCP issues: Is your hero image too large? Is your server response time slow? They should identify the bottleneck (e.g., render-blocking resources, unoptimized images).
  • Fixes CLS: Are ads, embeds, or fonts shifting the layout after load? They should reserve space for dynamic elements and use `font-display: swap`.
  • Optimizes INP: Are click events delayed by heavy JavaScript? They should defer non-critical scripts and use web workers where possible.

The risk: Some agencies promise “instant” Core Web Vitals fixes by applying generic plugins or CDNs. That’s not enough. You need a custom audit of your specific stack (CMS, hosting, theme). Poorly implemented fixes (e.g., lazy-loading above-the-fold images) can actually hurt performance.
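Google publishes “good” thresholds for each metric, measured at the 75th percentile of field data: LCP ≤ 2.5 s, CLS ≤ 0.1, INP ≤ 200 ms. A minimal sketch of a pass/fail check (the field values below are hypothetical numbers of the kind you would pull from CrUX or PageSpeed Insights):

```python
# Google's "good" thresholds at the 75th percentile of field data.
THRESHOLDS = {"LCP": 2.5, "CLS": 0.1, "INP": 0.2}  # seconds, unitless, seconds

def grade(metrics):
    """Return the Core Web Vitals that miss the 'good' threshold."""
    return [name for name, value in metrics.items()
            if value > THRESHOLDS[name]]

# Hypothetical 75th-percentile field data for one page:
failing = grade({"LCP": 3.1, "CLS": 0.05, "INP": 0.35})
print(failing)  # ['LCP', 'INP']
```

A report built on this kind of check tells you which metric to attack first, rather than a single opaque “performance score.”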

4. Duplicate Content & Canonicalization: The Silent Ranking Killer

Duplicate content isn’t just a technical nuisance—it can dilute your ranking signals. If Google finds the same content on multiple URLs, it has to guess which one to rank. Your agency should use canonical tags to tell Google which version is the “master.”

What to expect:

  • Audit for duplicates: Check for www vs. non-www, HTTP vs. HTTPS, trailing slashes, and URL parameters (e.g., `?utm_source=...`). These should be handled via 301 redirects or canonical tags.
  • Handle pagination: Google no longer uses `rel="prev"` and `rel="next"` as an indexing signal, so for blog archives or product listings, make sure each paginated page is self-canonical and reachable through internal links (or offer a crawlable “view all” page).
  • Syndicated content: If you republish content elsewhere (e.g., Medium, LinkedIn), your agency should add a canonical tag pointing back to your original URL.

The risk: Misusing canonical tags (e.g., pointing to a different page entirely) can cause Google to ignore the canonicalized page. Also, relying solely on canonical tags without fixing underlying URL issues is a band-aid, not a solution.
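The normalization rules above (https, non-www, no trailing slash, tracking parameters stripped) can be sketched with the standard library. These particular rules are one common convention, not the only valid one; pick the variants that match your site's 301 redirects, and note that the tracking-parameter list is an assumption.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumption: these query-parameter prefixes carry no content and can be dropped.
TRACKING_PREFIXES = ("utm_", "gclid", "fbclid")

def canonicalize(url):
    """Normalize to https, non-www, no trailing slash, no tracking params."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower().removeprefix("www.")
    path = path.rstrip("/") or "/"
    kept = [(k, v) for k, v in parse_qsl(query)
            if not k.startswith(TRACKING_PREFIXES)]
    return urlunsplit(("https", netloc, path, urlencode(kept), ""))

print(canonicalize("http://www.example.com/blog/?utm_source=news&page=2"))
# https://example.com/blog?page=2
```

Running every crawled URL through a function like this makes duplicate variants obvious: any two URLs that canonicalize to the same string are candidates for a 301 or a canonical tag.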

5. On-Page Optimization & Keyword Research: Beyond the Meta Tags

On-page optimization is more than stuffing keywords into title tags. It’s about aligning your content with search intent—what the user actually wants when they type a query.

What your agency should deliver:

  • Keyword research with intent mapping: Not just “high volume keywords.” They should categorize terms by intent: informational (blog posts), navigational (brand searches), commercial (comparisons), transactional (buy now). Each page should target one primary intent.
  • Content strategy: A plan for which pages to create, update, or consolidate. This includes identifying keyword gaps and opportunities for topic clusters.
  • Technical on-page elements: Optimized title tags, meta descriptions, header tags (H1-H3), internal links, image alt text, and structured data (schema markup) for rich snippets.

The risk: Agencies that focus only on keyword density or “exact match” URLs are stuck in 2015. Modern on-page SEO is about relevance and user experience. Also, avoid agencies that promise “instant rankings” for competitive keywords—that’s a red flag for black-hat tactics.
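The structured-data bullet above boils down to embedding a schema.org JSON-LD object in the page. A minimal sketch for an `Article` type follows; every value here is a placeholder to swap for your own page's details.

```python
import json

# Placeholder Article structured data (schema.org JSON-LD).
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "The Technical SEO & Site Health Checklist",
    "author": {"@type": "Person", "name": "Wendy Garza"},
}

# The JSON-LD payload goes in a <script> tag in the page <head> or <body>.
snippet = f'<script type="application/ld+json">{json.dumps(article)}</script>'
print(snippet)
```

Validate the result with Google's Rich Results Test before shipping: malformed or misleading markup earns no rich snippet and can draw a manual action.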

6. Link Building: Quality Over Quantity (And Beware of Black Hat)

Link building remains a strong ranking signal, but it’s also the most dangerous area for shortcuts. Black-hat links—paid links, private blog networks (PBNs), spammy directories—can trigger a manual penalty from Google.

What a legitimate agency does:

  • Audits your backlink profile: They should analyze your existing links for toxic domains (low Trust Flow, high spam score) and disavow harmful ones.
  • Builds contextually relevant links: Guest posts on authoritative industry sites, resource page links, broken link building, and digital PR (earned media).
  • Treats Domain Authority (DA) and Trust Flow (TF) as metrics, not goals: These are third-party proxies (from Moz and Majestic, respectively), not Google ranking factors. Your agency should target sites relevant to your niche, not just high DA for its own sake.

The risk: An agency that guarantees a certain number of backlinks per month without specifying quality or relevance is likely using black-hat methods. Also, be wary of “link packages” that promise instant DA boosts—those are usually from link farms.
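The disavow step above ends in a plain-text file Google accepts, with one `domain:` line per toxic domain. A minimal sketch, assuming a hypothetical backlink export with a 0–100 spam score (the domains and threshold below are made up; tune the cutoff to your tool's scale):

```python
# Hypothetical backlink-tool export: domain plus a 0-100 spam score.
backlinks = [
    {"domain": "spammy-directory.biz", "spam_score": 92},
    {"domain": "industry-journal.com", "spam_score": 3},
    {"domain": "link-farm.xyz", "spam_score": 88},
]

SPAM_THRESHOLD = 60  # assumption: adjust to your tool's scoring scale

def disavow_file(links, threshold=SPAM_THRESHOLD):
    """One 'domain:' line per toxic domain, in Google's disavow file format."""
    toxic = sorted(link["domain"] for link in links
                   if link["spam_score"] >= threshold)
    return "\n".join(f"domain:{d}" for d in toxic)

print(disavow_file(backlinks))
# domain:link-farm.xyz
# domain:spammy-directory.biz
```

The output uploads directly to Search Console's disavow tool—but a human should review every line first, since disavowing a legitimate domain throws away real link equity.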

7. Reporting & Ongoing Monitoring: The Proof Is in the Data

Your agency should provide transparent, actionable reports that go beyond “we added 10 backlinks this month.” They should show how technical changes impact organic traffic, conversions, and rankings.

What to look for in reports:

  • Crawl and index status: How many pages are indexed? Are there new crawl errors?
  • Core Web Vitals trends: Are your LCP, CLS, and INP improving or degrading over time?
  • Keyword rankings: Tracked against baseline (before the engagement) and with context (seasonality, algorithm updates).
  • Backlink profile changes: New links acquired, lost links, and any toxic link alerts.
  • Organic traffic and conversions: Not just sessions, but goal completions (e.g., form fills, purchases).

The risk: Vanity metrics (e.g., total backlinks, keyword rankings for non-commercial terms) can hide real problems. Your agency should be able to answer: “Did our technical fixes lead to more revenue?” If they can’t, ask for a deeper analysis.

Summary: Your Actionable Checklist

| Area | What Your Agency Should Do | Red Flags to Watch |
| --- | --- | --- |
| Technical SEO audit | Deep crawl analysis, prioritize fixes | Generic tool dump, no prioritization |
| Crawl budget | Optimize sitemap, robots.txt, fix redirects | Over-blocking or ignoring crawl stats |
| Core Web Vitals | Diagnose LCP, CLS, INP; custom fixes | Plugin-only fixes, ignoring real user data |
| Duplicate content | Canonical tags, 301 redirects, pagination handling | Misused canonical tags, ignoring URL parameters |
| On-page & keywords | Intent mapping, content strategy, schema | Keyword stuffing, exact-match obsession |
| Link building | Quality outreach, disavow toxic links | Guaranteed link counts, PBNs, paid links |
| Reporting | Transparent metrics, traffic-to-revenue link | Vanity metrics, no conversion tracking |

Your SEO agency should be your partner in technical health, not a vendor selling shortcuts. Use this checklist to hold them accountable—and if they can’t explain why they’re doing something, it’s time to ask more questions.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
