The Expert SEO Agency Checklist: What a Real Technical Audit Should Cover

You’ve hired an SEO agency, or you’re about to. The pitch deck looked polished, the case studies were impressive, and the account manager used all the right buzzwords—crawl budget, Core Web Vitals, canonical tags. But here’s the uncomfortable truth: not every agency delivers the same depth of technical work. Some will run a quick Screaming Frog crawl, print a PDF with a few red flags, and call it a day. Others will dig into server logs, analyze rendering behavior, and map every redirect chain back to your business logic. The difference isn’t just in tools—it’s in process. This checklist is designed to help you evaluate whether your agency is doing the real work, or just the visible work. We’ll walk through what a proper technical SEO audit should include, how to brief a link building campaign without falling for black-hat promises, and where most site health reviews quietly fail.

1. The Pre-Audit: Crawl Budget and Site Architecture

Before any optimization begins, your agency should understand how search engines actually interact with your website. This starts with crawl budget—the number of URLs Googlebot will crawl on your site within a given timeframe. For small blogs, this is rarely a concern. For large e-commerce sites, news platforms, or enterprise portals with thousands of product pages, crawl budget management can make or break your indexation coverage. A competent agency will ask for your server logs, not just your sitemap. They want to see which pages Google is actually hitting, how often, and with what response codes. If they’re only looking at your XML sitemap and robots.txt, they’re missing half the picture.

Here’s what the initial audit should cover:

  • Crawlability check: Review robots.txt for accidental blocking of critical resources (CSS, JS, images). Ensure no “Disallow: /” is present on live pages unless intentional.
  • Crawl budget analysis: Identify wasted crawl on thin pages, infinite parameter loops, or redirect chains. Prioritize high-value pages for frequent crawling.
  • Site architecture review: Evaluate internal linking depth. Key pages should be reachable within 3 clicks from the homepage. Flat architecture reduces crawl depth and spreads link equity.
  • XML sitemap hygiene: Confirm sitemaps only include indexable, canonical pages. Exclude paginated parameters, session IDs, and noindex URLs. Submit sitemap to Google Search Console.
A common mistake agencies make is over-optimizing crawl budget without considering user experience. Blocking low-value URLs is smart, but if those URLs serve real user intent (like filtered product categories), you need a strategy—not just a block rule. The goal is efficient crawling, not minimal crawling.
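Server log analysis sounds abstract until you see how little code a first pass takes. Below is a minimal sketch that counts Googlebot requests per URL and status code from a standard "combined" access log; the regex is an assumption about your log format, so adjust it to match your server's configuration. Note that user agents can be spoofed, so a real audit should also verify crawler IPs via reverse DNS.

```python
import re
from collections import Counter

# Parser for a typical Apache/Nginx "combined" log line.
# Hypothetical format assumption - adapt the regex to your logs.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Count Googlebot requests per (path, status) pair.
    Caveat: user agents can be spoofed; confirm with reverse DNS."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[(m.group("path"), m.group("status"))] += 1
    return hits
```

Feeding a month of logs through this tells you immediately whether Googlebot is spending its crawl on your money pages or on parameter noise and redirect chains.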

2. On-Page Optimization: Beyond Title Tags and Meta Descriptions

On-page SEO is often reduced to keyword stuffing in headers and meta fields. That’s outdated. Modern on-page optimization involves semantic relevance, structured data, and user intent alignment. Your agency should be mapping keyword research to actual search intent—not just volume and difficulty scores. For example, a keyword like “best running shoes for flat feet” implies comparison and purchase intent, while “how to fix flat feet” implies informational intent. Serving the wrong content type (a product page for an informational query) will hurt performance regardless of how many times you repeat the keyword.

A thorough on-page audit should include:

  • Title tags. Check: unique, descriptive, within 60 characters. Common issue: duplicate or missing titles.
  • Meta descriptions. Check: compelling, includes the primary keyword, under 160 characters. Common issue: auto-generated or generic text.
  • Heading hierarchy. Check: one unique H1 per page, logical H2-H6 flow. Common issue: multiple H1s or skipped levels.
  • Image alt text. Check: descriptive, keyword-relevant, not stuffed. Common issue: missing alt text or irrelevant keywords.
  • Internal links. Check: contextually relevant, natural anchor text. Common issue: broken links or over-optimized anchors.
  • Structured data. Check: schema markup matching the content type (Article, Product, FAQ, etc.). Common issue: missing or incorrect schema.
  • Content quality. Check: original, comprehensive, answers user intent. Common issue: thin content or keyword stuffing.

The real test is whether the agency provides a content gap analysis. If they only optimize existing pages without identifying missing topics, you’re leaving traffic on the table. A good content strategy will include a cluster model: a pillar page that covers a broad topic, supported by cluster pages targeting specific subtopics. This signals topical authority to search engines and improves rankings for related queries.
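Several of the simpler checks above can be automated with nothing but the standard library. A minimal sketch, checking only title length and H1 count (a real crawler like Screaming Frog covers far more, and the 60-character title limit is a rule of thumb, not a Google requirement):

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Minimal on-page checker: title presence/length and H1 count.
    Sketch only - a real audit also covers schema, links, and content."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def issues(self):
        found = []
        if not self.title.strip():
            found.append("missing title")
        elif len(self.title.strip()) > 60:
            found.append("title over 60 characters")
        if self.h1_count != 1:
            found.append(f"expected 1 H1, found {self.h1_count}")
        return found

def audit(html):
    parser = OnPageAudit()
    parser.feed(html)
    return parser.issues()
```

Run it across a sitemap's worth of pages and you have a quick duplicate/missing-title report before the agency's first invoice arrives.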

3. Core Web Vitals and Site Performance: The Non-Negotiable

Google’s Core Web Vitals (Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift) are ranking signals. Note that INP replaced First Input Delay (FID) as the responsiveness metric in March 2024, so an audit that still reports only FID is out of date. More importantly, these metrics directly affect user experience: a slow site with layout shifts will bleed visitors, regardless of where you rank. Your agency should be measuring these metrics in real user conditions (via the Chrome User Experience Report), not just lab tests. Lab data from Lighthouse is useful for debugging, but field data tells you what real users experience.

Action steps for performance optimization:

  • LCP optimization: Ensure the largest visible element (usually a hero image or text block) loads quickly. Compress images, use next-gen formats (WebP, AVIF), implement lazy loading for below-fold content, and consider a CDN.
  • FID/INP improvement: Minimize JavaScript blocking. Defer non-critical scripts, inline critical CSS, and reduce third-party script impact. For interaction-heavy pages, aim for an Interaction to Next Paint (INP) under 200ms.
  • CLS prevention: Set explicit width and height attributes on images and embeds. Reserve space for ads or dynamic content. Avoid inserting content above existing content after load.
A common red flag is an agency promising instant Core Web Vitals fixes without first auditing the root causes. If they suggest buying a new hosting package without analyzing your current stack, question their approach. Performance optimization is iterative—measure, fix, measure again.
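The CLS bullet above is easiest to see in markup. Reserving space up front keeps the layout stable while assets and scripts load (filenames and dimensions here are placeholders):

```html
<!-- width/height set the intrinsic aspect ratio, so the browser
     reserves space before the image loads; CSS keeps it responsive. -->
<img src="hero.webp" alt="Runner on a trail"
     width="1200" height="630"
     style="max-width: 100%; height: auto;">

<!-- Reserve a fixed slot for an ad or late-loading embed, so the
     injected content doesn't push the page down after render. -->
<div style="min-height: 250px;">
  <!-- ad script injects here -->
</div>
```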

4. Link Building: How to Brief a Campaign Without Falling for Black-Hat Traps

Link building is the most risk-prone area of SEO. Black-hat tactics—private blog networks (PBNs), paid links, automated outreach—can trigger manual penalties and tank your rankings. A reputable agency will never guarantee specific Domain Authority (DA) increases or promise a fixed number of links per month without explaining the methodology. Instead, they should present a transparent backlink profile strategy based on relevance, authority, and natural acquisition.

When briefing a link building campaign, include these requirements:

  • Relevance over volume: Links from sites in your industry or adjacent niches carry more weight than generic directories. Ask for examples of target domains before the campaign starts.
  • Content-driven outreach: The best links come from creating genuinely useful content—original research, case studies, infographics, or expert guides. Your agency should propose content ideas, not just link requests.
  • Diversity of sources: A healthy backlink profile includes editorial links, guest posts, resource page mentions, and unlinked brand mentions. Avoid over-reliance on any one type.
  • Disavow readiness: Even legitimate campaigns can attract toxic links. Your agency should monitor the backlink profile monthly and prepare a disavow file if needed. This is not admission of guilt—it’s proactive maintenance.
A dangerous phrase to watch for: “We have relationships with high-DA sites that can guarantee links.” Real editorial links are earned, not bought. If the agency can’t explain how they’ll earn them, they’re likely using gray-hat methods that could backfire.
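On the disavow point: Google accepts a plain UTF-8 text file uploaded through Search Console, one rule per line. A minimal example (all domains here are placeholders):

```text
# Disavow file prepared during routine backlink review.
# Lines starting with "#" are comments; Google ignores them.

# Disavow every link from an entire domain:
domain:spammy-directory.example

# Disavow a single page:
https://low-quality-blog.example/paid-links-page.html
```

Keeping this file versioned and reviewed monthly means a toxic-link spike can be handled in hours, not weeks.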

5. Duplicate Content and Canonicalization: The Silent Ranking Killer

Duplicate content isn’t a penalty—it’s a filter. When Google encounters identical or near-identical content across multiple URLs, it selects one version to index and show in results. If it picks the wrong one, your traffic suffers. This is especially common in e-commerce sites with product variations (color, size, location) or sites with session IDs and tracking parameters.

Your agency should audit for:

  • Canonical tag implementation: Every page should have a self-referencing canonical tag unless it’s a duplicate. Ensure canonicals point to the correct preferred URL (e.g., HTTPS over HTTP, www vs non-www).
  • Parameter handling: Google Search Console’s URL Parameters tool has been retired, so don’t rely on it. Handle tracking parameters with canonical tags pointing to the clean URL, or strip them server-side so duplicate URLs are never exposed to crawlers in the first place.
  • Pagination and infinite scroll: Google no longer uses rel=“next” and rel=“prev” as indexing signals, so ensure each paginated page is crawlable and self-canonical, or implement view-all pages for small sets. Infinite scroll without proper URL management can create thousands of near-duplicate pages.
  • HTTP vs HTTPS and www vs non-www: Redirect all variants to a single canonical version. Use 301 redirects, not 302, for permanent moves.
A thorough agency will also check for soft 404s—pages that return a 200 status code but show no meaningful content. These confuse search engines and waste crawl budget. If your site has a “no results” page for empty search queries, it should return a 404 or 410.
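Parameter-driven duplication is also easy to reason about in code. A minimal sketch that normalizes URLs to one preferred scheme and host and strips common tracking parameters (the parameter list and host are assumptions; adjust them to your stack):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical policy: drop common tracking parameters and force a
# single preferred scheme/host so duplicates collapse to one URL.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "gclid", "fbclid"}

def canonicalize(url, preferred_host="www.example.com"):
    """Return the canonical form of a URL, assuming one preferred host."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k not in TRACKING_PARAMS]
    path = parts.path or "/"
    return urlunsplit(("https", preferred_host, path, urlencode(query), ""))
```

The same normalization logic, applied server-side as a 301 redirect rule, is exactly the “single canonical version” the bullet list above describes.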

6. Analytics and Reporting: What to Expect from a Real SEO Agency

The final piece is measurement. Without proper analytics and reporting, you can’t know if the agency’s work is driving results. A good report goes beyond ranking positions and traffic volume. It should tie SEO activities to business outcomes—leads, sales, or conversions.

What a useful monthly report includes:

  • Organic traffic trends: Compare month-over-month and year-over-year, segmented by landing page and device.
  • Keyword movement: Track priority keywords for visibility, not just rank position. A visibility metric that weights impressions by average position tells you more than raw rank alone.
  • Technical health score: Based on crawl errors, indexation status, Core Web Vitals pass rate, and manual action warnings.
  • Backlink profile changes: New links acquired, lost links, toxicity score, and domain diversity.
  • Conversion attribution: Use goals or e-commerce tracking to show how organic traffic converts. This may require integration with your CRM or analytics platform.
If the agency only sends a PDF with a few charts and no commentary, push back. You need actionable insights—what worked, what didn’t, and what the next steps are. A good agency will also flag risks early, like a sudden drop in crawl rate or a spike in 404 errors.
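The month-over-month comparison from the report list above can be sketched in a few lines, assuming sessions-by-landing-page dicts exported from your analytics platform:

```python
def traffic_change(current, previous):
    """Percent change in organic sessions per landing page.
    Pages seen in only one period are flagged rather than computed."""
    report = {}
    for page in set(current) | set(previous):
        now, before = current.get(page, 0), previous.get(page, 0)
        if before == 0:
            report[page] = "new" if now else "no data"
        else:
            report[page] = round((now - before) / before * 100, 1)
    return report
```

A report built this way surfaces lost pages (a -100.0 entry) as loudly as winners, which is exactly the early risk-flagging you should expect.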

Final Checklist: How to Evaluate Your Agency’s Technical SEO

Use this checklist when reviewing your agency’s deliverables. If they’re missing three or more items, it’s time for a conversation.

  • Server log analysis for crawl budget assessment
  • robots.txt and XML sitemap audit with recommendations
  • Internal linking structure review (depth, anchor text, orphan pages)
  • On-page optimization for title tags, meta descriptions, headers, and structured data
  • Core Web Vitals field data analysis and optimization plan
  • Duplicate content audit with canonical tag fixes
  • Link building strategy with transparent outreach methodology
  • Monthly reporting with traffic, keyword visibility, technical health, and conversion data
  • Risk monitoring: manual action warnings, toxic backlinks, algorithm updates
Remember, technical SEO is not a one-time fix. It’s an ongoing process of monitoring, testing, and adapting. The best agencies treat it as a partnership, not a service. If you’re looking for a deeper dive into specific areas, check out our guides on site health audits and on-page optimization best practices. Your site deserves more than a checklist—it deserves a strategy.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
