The Expert SEO Agency Checklist: What a Real Technical Audit Should Cover
You’ve hired an SEO agency, or you’re about to. The pitch deck looked polished, the case studies were impressive, and the account manager used all the right buzzwords—crawl budget, Core Web Vitals, canonical tags. But here’s the uncomfortable truth: not every agency delivers the same depth of technical work. Some will run a quick Screaming Frog crawl, print a PDF with a few red flags, and call it a day. Others will dig into server logs, analyze rendering behavior, and map every redirect chain back to your business logic. The difference isn’t just in tools—it’s in process. This checklist is designed to help you evaluate whether your agency is doing the real work, or just the visible work. We’ll walk through what a proper technical SEO audit should include, how to brief a link building campaign without falling for black-hat promises, and where most site health reviews quietly fail.
1. The Pre-Audit: Crawl Budget and Site Architecture
Before any optimization begins, your agency should understand how search engines actually interact with your website. This starts with crawl budget—the number of URLs Googlebot will crawl on your site within a given timeframe. For small blogs, this is rarely a concern. For large e-commerce sites, news platforms, or enterprise portals with thousands of product pages, crawl budget management can make or break your indexation coverage. A competent agency will ask for your server logs, not just your sitemap. They want to see which pages Google is actually hitting, how often, and with what response codes. If they’re only looking at your XML sitemap and robots.txt, they’re missing half the picture.
Here’s what the initial audit should cover:
- Crawlability check: Review robots.txt for accidental blocking of critical resources (CSS, JS, images). Ensure a blanket "Disallow: /" hasn't been left in robots.txt on the live site unless intentional—it's a common leftover from staging environments.
- Crawl budget analysis: Identify wasted crawl on thin pages, infinite parameter loops, or redirect chains. Prioritize high-value pages for frequent crawling.
- Site architecture review: Evaluate internal linking depth. Key pages should be reachable within 3 clicks from the homepage. Flat architecture reduces crawl depth and spreads link equity.
- XML sitemap hygiene: Confirm sitemaps only include indexable, canonical pages. Exclude paginated parameters, session IDs, and noindex URLs. Submit sitemap to Google Search Console.
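A crawl budget review like the one described above starts with server logs. Here's a minimal sketch of the kind of analysis involved, assuming logs in Common Log Format; the field layout, regex, and sample lines are illustrative, not specific to any server:

```python
import re
from collections import Counter

# Tally Googlebot hits per URL path and per status code from access-log
# lines. The log format and sample entries below are illustrative.
LOG_LINE = re.compile(r'"\w+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def crawl_stats(lines):
    """Return (hits per path, hits per status code) for Googlebot requests."""
    paths, statuses = Counter(), Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # only count search engine crawler traffic
        m = LOG_LINE.search(line)
        if m:
            paths[m.group("path")] += 1
            statuses[m.group("status")] += 1
    return paths, statuses

sample = [
    '66.249.66.1 - - [01/Jan/2024] "GET /product/123 HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Jan/2024] "GET /old-page HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '203.0.113.5 - - [01/Jan/2024] "GET /product/123 HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
paths, statuses = crawl_stats(sample)
```

A high share of 301 or 404 responses in the Googlebot tally is exactly the "wasted crawl" the audit should surface: crawl budget spent on redirect chains instead of high-value pages.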
2. On-Page Optimization: Beyond Title Tags and Meta Descriptions
On-page SEO is often reduced to keyword stuffing in headers and meta fields. That’s outdated. Modern on-page optimization involves semantic relevance, structured data, and user intent alignment. Your agency should be mapping keyword research to actual search intent—not just volume and difficulty scores. For example, a keyword like “best running shoes for flat feet” implies comparison and purchase intent, while “how to fix flat feet” implies informational intent. Serving the wrong content type (a product page for an informational query) will hurt performance regardless of how many times you repeat the keyword.
A thorough on-page audit should include:
| Element | What to Check | Common Issue |
|---|---|---|
| Title tags | Unique, descriptive, within 60 characters | Duplicate or missing titles |
| Meta descriptions | Compelling, includes primary keyword, under 160 characters | Auto-generated or generic text |
| Heading hierarchy | H1 unique per page, H2-H6 logical flow | Multiple H1s or skipped levels |
| Image alt text | Descriptive, keyword-relevant, not stuffed | Missing alt or irrelevant keywords |
| Internal links | Contextually relevant, natural anchor text | Broken links or over-optimized anchors |
| Structured data | Schema markup for type (Article, Product, FAQ, etc.) | Missing or incorrect schema |
| Content quality | Original, comprehensive, answers user intent | Thin content or keyword stuffing |
The real test is whether the agency provides a content gap analysis. If they only optimize existing pages without identifying missing topics, you’re leaving traffic on the table. A good content strategy will include a cluster model: a pillar page that covers a broad topic, supported by cluster pages targeting specific subtopics. This signals topical authority to search engines and improves rankings for related queries.
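Several of the checks in the table above can be automated. A minimal sketch using only the standard library, flagging titles over 60 characters and missing or overlong meta descriptions (the sample HTML is invented for illustration):

```python
from html.parser import HTMLParser

# Flag on-page issues: missing/overlong titles and meta descriptions.
# Thresholds (60 / 160 characters) match the audit table above.
class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html):
    parser = OnPageAudit()
    parser.feed(html)
    issues = []
    if not parser.title:
        issues.append("missing title")
    elif len(parser.title) > 60:
        issues.append("title over 60 characters")
    if parser.meta_description is None:
        issues.append("missing meta description")
    elif len(parser.meta_description) > 160:
        issues.append("meta description over 160 characters")
    return issues

page = "<html><head><title>Best Running Shoes for Flat Feet</title></head><body></body></html>"
```

Checks like this catch the mechanical issues; the intent-matching and content-gap work still requires human judgment.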

3. Core Web Vitals and Site Performance: The Non-Negotiable
Google’s Core Web Vitals—Largest Contentful Paint (LCP), Interaction to Next Paint (INP, which replaced First Input Delay as the responsiveness metric in March 2024), and Cumulative Layout Shift (CLS)—are ranking signals. But more importantly, they directly affect user experience. A slow site with layout shifts will bleed visitors, regardless of where you rank. Your agency should be measuring these metrics in real user conditions (via the Chrome User Experience Report), not just lab tests. Lab data from Lighthouse is useful for debugging, but field data tells you what real users experience.
Action steps for performance optimization:
- LCP optimization: Ensure the largest visible element (usually a hero image or text block) loads quickly. Compress images, use next-gen formats (WebP, AVIF), implement lazy loading for below-fold content, and consider a CDN.
- INP improvement: Minimize main-thread JavaScript blocking. Defer non-critical scripts, inline critical CSS, and reduce third-party script impact. INP replaced FID as the Core Web Vital for responsiveness in March 2024; for interaction-heavy pages, aim for an INP under 200ms.
- CLS prevention: Set explicit width and height attributes on images and embeds. Reserve space for ads or dynamic content. Avoid inserting content above existing content after load.
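The pass/fail logic against Google's published "good" thresholds (LCP ≤ 2.5 s, INP ≤ 200 ms, CLS ≤ 0.1) is simple to express; the field values below are invented—in practice they would come from CrUX field data:

```python
# Google's published "good" thresholds for each Core Web Vital.
THRESHOLDS = {"lcp_ms": 2500, "inp_ms": 200, "cls": 0.1}

def passes_core_web_vitals(metrics):
    """Return metric -> True/False against the 'good' thresholds."""
    return {name: metrics[name] <= limit for name, limit in THRESHOLDS.items()}

# Invented sample: a page that loads fast but responds slowly.
field_data = {"lcp_ms": 2100, "inp_ms": 250, "cls": 0.05}
result = passes_core_web_vitals(field_data)
# result: {"lcp_ms": True, "inp_ms": False, "cls": True}
```

A report that shows per-metric pass rates at the 75th percentile of real users (the threshold Google actually applies) is far more actionable than a single Lighthouse score.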
4. Link Building: How to Brief a Campaign Without Falling for Black-Hat Traps
Link building is the most risk-prone area of SEO. Black-hat tactics—private blog networks (PBNs), paid links, automated outreach—can trigger manual penalties and tank your rankings. A reputable agency will never guarantee specific Domain Authority (DA) increases or promise a fixed number of links per month without explaining the methodology. Instead, they should present a transparent backlink profile strategy based on relevance, authority, and natural acquisition.
When briefing a link building campaign, include these requirements:
- Relevance over volume: Links from sites in your industry or adjacent niches carry more weight than generic directories. Ask for examples of target domains before the campaign starts.
- Content-driven outreach: The best links come from creating genuinely useful content—original research, case studies, infographics, or expert guides. Your agency should propose content ideas, not just link requests.
- Diversity of sources: A healthy backlink profile includes editorial links, guest posts, resource page mentions, and unlinked brand mentions. Avoid over-reliance on any one type.
- Disavow readiness: Even legitimate campaigns can attract toxic links. Your agency should monitor the backlink profile monthly and prepare a disavow file if needed. This is not admission of guilt—it’s proactive maintenance.
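The disavow file itself is plain text in Google's documented syntax: one `domain:` entry or full URL per line, with `#` comments. A sketch of generating one from a flagged list (the toxic sites are made up):

```python
# Build a disavow file in Google's documented format:
# "#" comments, "domain:example.com" entries, or full URLs, one per line.
def build_disavow(domains, urls):
    lines = ["# Disavow file - proactive maintenance, reviewed monthly"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    lines += sorted(urls)
    return "\n".join(lines) + "\n"

# Invented examples of flagged sources.
toxic_domains = {"spammy-directory.example", "pbn-network.example"}
toxic_urls = {"https://blog.example/paid-link-post"}
disavow_txt = build_disavow(toxic_domains, toxic_urls)
```

The harder part is the judgment call on what counts as toxic; the file format is the easy 10%, which is why the monthly monitoring matters more than the tooling.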
5. Duplicate Content and Canonicalization: The Silent Ranking Killer
Duplicate content isn’t a penalty—it’s a filter. When Google encounters identical or near-identical content across multiple URLs, it selects one version to index and show in results. If it picks the wrong one, your traffic suffers. This is especially common in e-commerce sites with product variations (color, size, location) or sites with session IDs and tracking parameters.

Your agency should audit for:
- Canonical tag implementation: Every page should have a self-referencing canonical tag unless it’s a duplicate. Ensure canonicals point to the correct preferred URL (e.g., HTTPS over HTTP, www vs non-www).
- Parameter handling: Google Search Console’s URL Parameters tool was retired in 2022, so you can no longer tell Google how to treat tracking parameters there. Instead, point parameterized URLs at the clean version with canonical tags, or strip tracking parameters server-side so the duplicate URLs never exist in the first place.
- Pagination and infinite scroll: Google stopped using rel="next" and rel="prev" as an indexing signal in 2019, so don’t rely on them. Instead, make each paginated page self-canonical and reachable through crawlable links, or implement view-all pages for small sets. Infinite scroll without crawlable, paginated URLs behind it can leave deep content invisible to search engines.
- HTTP vs HTTPS and www vs non-www: Redirect all variants to a single canonical version. Use 301 redirects, not 302, for permanent moves.
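The normalization described in these bullets—one canonical version per page—amounts to a deterministic URL rewrite. A sketch of the server-side logic, assuming HTTPS and non-www are the preferred variants and using an illustrative tracking-parameter list:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative set of tracking parameters to strip; extend as needed.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def canonicalize(url):
    """Map every URL variant to one canonical form:
    https scheme, no www prefix, tracking parameters removed."""
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k not in TRACKING_PARAMS])
    return urlunsplit(("https", host, parts.path, query, ""))

variants = [
    "http://www.example.com/shoes?utm_source=newsletter",
    "https://example.com/shoes",
    "http://example.com/shoes?gclid=abc123",
]
canonical = {canonicalize(u) for u in variants}  # collapses to one URL
```

Whether this runs as a redirect rule at the edge or informs the canonical tag in your templates, the point is the same: every variant resolves to exactly one indexable URL.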
6. Analytics and Reporting: What to Expect from a Real SEO Agency
The final piece is measurement. Without proper analytics and reporting, you can’t know if the agency’s work is driving results. A good report goes beyond ranking positions and traffic volume. It should tie SEO activities to business outcomes—leads, sales, or conversions.
What a useful monthly report includes:
- Organic traffic trends: Compare month-over-month and year-over-year, segmented by landing page and device.
- Keyword movement: Track priority keywords for visibility, not just rank position. A visibility score that weights rankings by search volume—or simply impressions and clicks from Search Console—is more meaningful than raw rank alone.
- Technical health score: Based on crawl errors, indexation status, Core Web Vitals pass rate, and manual action warnings.
- Backlink profile changes: New links acquired, lost links, toxicity score, and domain diversity.
- Conversion attribution: Use goals or e-commerce tracking to show how organic traffic converts. This may require integration with your CRM or analytics platform.
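The month-over-month comparison segmented by landing page is straightforward to compute once the data is exported. A sketch with invented session counts:

```python
# Percent change in organic sessions per landing page, month over month.
# Session counts below are invented sample data.
def mom_change(current, previous):
    """Return percent change per page; None where there is no baseline."""
    report = {}
    for page, sessions in current.items():
        base = previous.get(page)
        report[page] = None if not base else round((sessions - base) / base * 100, 1)
    return report

march = {"/pillar-guide": 1200, "/product/shoes": 450, "/new-post": 300}
february = {"/pillar-guide": 1000, "/product/shoes": 500}
changes = mom_change(march, february)
# changes: {"/pillar-guide": 20.0, "/product/shoes": -10.0, "/new-post": None}
```

Pages with no baseline (new content) deserve their own line in the report rather than being lumped into the aggregate trend, which is what the `None` marker is for.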
Final Checklist: How to Evaluate Your Agency’s Technical SEO
Use this checklist when reviewing your agency’s deliverables. If they’re missing three or more items, it’s time for a conversation.
- Server log analysis for crawl budget assessment
- robots.txt and XML sitemap audit with recommendations
- Internal linking structure review (depth, anchor text, orphan pages)
- On-page optimization for title tags, meta descriptions, headers, and structured data
- Core Web Vitals field data analysis and optimization plan
- Duplicate content audit with canonical tag fixes
- Link building strategy with transparent outreach methodology
- Monthly reporting with traffic, keyword visibility, technical health, and conversion data
- Risk monitoring: manual action warnings, toxic backlinks, algorithm updates
