The SEO Agency Checklist: How to Vet Technical Audits, Content Strategy & Site Performance
You’ve just signed a contract with an SEO agency, or you’re about to. The pitch deck looked slick—promises of “increased organic visibility,” “data-driven strategies,” and “Core Web Vitals optimization.” But here’s the thing: the difference between an agency that moves the needle and one that just moves your budget often comes down to how well you brief them on technical SEO and site health. Without a clear checklist, you risk paying for generic reports that don’t address your site’s actual crawl bottlenecks, duplicate content issues, or pagination problems.
This guide walks you through what to demand from your agency, how to evaluate their technical audit, and how to brief a content strategy that aligns with search intent—without falling for black-hat shortcuts. Treat this as your negotiation template, not a passive read.
Step 1: Demand a Crawl Budget & Indexation Audit First
Before any keyword research or link building begins, your agency must prove they understand how Googlebot interacts with your site. A proper technical SEO audit starts with two files: `robots.txt` and your XML sitemap. If your agency skips this, they’re guessing.
- Robots.txt check: Ask them to show you the current `robots.txt` file. Is it blocking critical resources like CSS, JavaScript, or images? A common mistake is accidentally disallowing entire sections of your site (e.g., `/blog/` or `/products/`). They should also confirm that your sitemap URL is referenced in the file (you can run this check yourself with the sketch after this list).
- XML sitemap audit: Your sitemap should only include canonical, indexable URLs. If it’s bloated with paginated pages, parameter-heavy URLs, or thin content, you’re wasting crawl budget. The agency should flag any sitemaps that exceed 50,000 URLs or are larger than 50MB (uncompressed).
- Crawl budget analysis: For large sites (10,000+ pages), the agency must estimate how many pages Google crawls daily versus how many you want indexed. If your crawl budget is too low, your new content won’t get discovered quickly. They should recommend consolidating low-value pages, improving internal linking, and reducing redirect chains.
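You can run the robots.txt portion of this audit yourself before the kickoff call. A minimal sketch using Python's standard library; the domain and paths below are placeholders, so swap in your own:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain and paths -- substitute your own site's values.
SITE = "https://www.example.com"
CRITICAL_PATHS = ["/blog/", "/products/", "/assets/css/main.css"]

parser = RobotFileParser(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in CRITICAL_PATHS:
    allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
    print(f"{path}: {'OK' if allowed else 'BLOCKED -- investigate'}")

# site_maps() (Python 3.8+) returns the sitemap URLs declared in
# robots.txt, or None if the file declares none.
print("Sitemaps declared:", parser.site_maps())
```

If any critical path comes back blocked, or `site_maps()` returns None, that's your first agenda item for the audit review.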
Step 2: Verify Their Approach to Duplicate Content & Canonicalization
Duplicate content can dilute SEO performance. It can reduce link equity, confuse search engines about which version to rank, and may trigger algorithmic filters. Your agency’s technical audit should include a thorough check for:
- Missing or incorrect canonical tags: Every page should have a self-referencing canonical tag unless it’s a syndicated or parameterized version. The agency must check that `rel="canonical"` points to the correct URL, not a 301 redirect or a non-indexable page (a quick spot-check is sketched after this list).
- Pagination issues: If you have paginated category pages (e.g., `/category/page/2/`), be wary of audits that lean on `rel="next"` and `rel="prev"` tags; Google confirmed in 2019 that it no longer uses them as an indexing signal. Better options are letting each paginated page self-canonicalize (not canonicalize to page one) or offering a “View All” page with canonicals pointing to it. Avoid infinite scroll without proper History API integration, or search engines may only index the first page.
- HTTP vs. HTTPS and www vs. non-www: This sounds basic, but many audits miss mixed content warnings. The agency should verify that all internal links use the same protocol and subdomain. They should also check for duplicate content caused by trailing slashes or URL case sensitivity.
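Before the agency’s full report lands, you can spot-check canonicals on a handful of URLs. A minimal sketch using the third-party `requests` and `beautifulsoup4` packages; the sample URLs are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder sample -- pull real URLs from your sitemap instead.
URLS = [
    "https://www.example.com/products/widget",
    "https://www.example.com/category/page/2/",
]

for url in URLS:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else None

    if not canonical:
        print(f"{url}: MISSING canonical tag")
    elif canonical != resp.url:
        # resp.url is the final URL after redirects; a mismatch means the
        # canonical is not self-referencing (which may be intentional for
        # parameterized or syndicated pages -- verify case by case).
        print(f"{url}: canonical -> {canonical}")
    else:
        print(f"{url}: OK (self-referencing)")
```

A strict string comparison like this will also flag harmless differences (trailing slashes, http vs. https), which is exactly the kind of inconsistency the audit should resolve anyway.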

| Trigger | Agency Action | Risk if Ignored |
|---|---|---|
| URL parameters (e.g., `?sort=price`) | Add canonical tags and keep internal links parameter-free (GSC’s URL Parameters tool was retired in 2022) | Crawl budget waste, diluted rankings |
| Session IDs in URLs | Block via robots.txt or move session state into cookies | Index bloat, diluted ranking signals |
| Printer-friendly pages | Add `noindex` or canonical to original page | Thin content competing with main page |
| Syndicated content | Use `rel="canonical"` pointing to original source | Duplicate content filter, lost link equity |
For a deeper dive, read our analysis on /duplicate-content-issues and /canonical-urls.
Step 3: Insist on Core Web Vitals & Site Performance Metrics
Core Web Vitals (LCP, CLS, and INP, which replaced FID in March 2024) are part of Google’s page experience ranking signals. If your agency isn’t measuring and optimizing these, they’re operating with a blind spot. But here’s the nuance: you need to distinguish between lab data (from Lighthouse) and field data (from the Chrome User Experience Report, or CrUX). A good agency will use both, and you can pull field data yourself with the sketch after this list.
- LCP (Largest Contentful Paint): Should be under 2.5 seconds. The agency should identify the largest element (often a hero image or text block) and recommend lazy loading for below-fold images, preloading critical assets, and optimizing server response times.
- CLS (Cumulative Layout Shift): Should be under 0.1. This is often caused by ads, embeds, or images without explicit dimensions. The agency must audit third-party scripts and suggest using `aspect-ratio` in CSS or reserving space for dynamic content.
- INP (Interaction to Next Paint): Replaced FID as a Core Web Vital in March 2024. Under 200 milliseconds is generally considered good based on Google’s guidance. The agency should profile JavaScript execution, especially on pages with heavy interactivity (e.g., product filters, accordion content).
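Field data is queryable directly, so you don’t have to take the agency’s screenshots on faith. A minimal sketch against the Chrome UX Report API; the API key and origin are placeholders, and you’d need to enable the API in a Google Cloud project first:

```python
import requests

API_KEY = "YOUR_CRUX_API_KEY"  # placeholder
ENDPOINT = ("https://chromeuxreport.googleapis.com/v1/"
            f"records:queryRecord?key={API_KEY}")

payload = {
    "origin": "https://www.example.com",  # placeholder origin
    "metrics": [
        "largest_contentful_paint",
        "cumulative_layout_shift",
        "interaction_to_next_paint",
    ],
}

resp = requests.post(ENDPOINT, json=payload, timeout=10)
resp.raise_for_status()

# Google assesses Core Web Vitals at the 75th percentile.
for name, data in resp.json()["record"]["metrics"].items():
    print(f"{name}: p75 = {data['percentiles']['p75']}")
```

Compare these p75 numbers against the thresholds above (2.5 s, 0.1, 200 ms); lab scores from Lighthouse can look fine while field data fails.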
Risk alert: Poor Core Web Vitals may affect your site’s eligibility for Google Discover and Top Stories. If your agency ignores this, you’re leaving traffic on the table. Also, avoid agencies that promise “instant Core Web Vitals fixes” by removing all JavaScript—this often breaks functionality and user experience.
Step 4: Evaluate Their On-Page Optimization & Content Strategy
On-page optimization goes beyond stuffing keywords into title tags. Your agency should demonstrate an understanding of search intent mapping—matching content to what users actually want at each stage of the funnel. Here’s how to brief them effectively:
- Keyword research with intent layers: Ask for a spreadsheet that categorizes keywords into informational, navigational, commercial, and transactional. For each target keyword, the agency should specify the recommended content type (blog post, product page, guide, video).
- Content gap analysis: They should compare your existing content against competitors’ top-ranking pages for your target terms. If you’re missing a pillar page or a FAQ section, they need to flag it.
- Internal linking strategy: The agency must show how they plan to distribute link equity from high-authority pages to newer or deeper pages. This includes anchor text optimization (avoiding over-optimization) and ensuring every important page has at least three internal links; the sketch after this list shows one way to verify that threshold.
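The internal-link threshold is verifiable from any crawler export. A minimal sketch, assuming a CSV of internal link pairs with `source` and `target` columns; the filename and column names are hypothetical, though exports from tools like Screaming Frog can be reshaped to match:

```python
import csv
from collections import Counter

inbound = Counter()
with open("internal_links.csv", newline="") as f:
    for row in csv.DictReader(f):
        inbound[row["target"]] += 1

# Pages with zero inbound links never appear in a link export at all,
# so seed the counter from your full URL list to catch true orphans.
weak = {url: n for url, n in inbound.items() if n < 3}
for url, n in sorted(weak.items(), key=lambda kv: kv[1]):
    print(f"{n} inbound internal link(s): {url}")
```

Hand the resulting list to the agency and ask how their linking plan addresses each page.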

Step 5: Brief a Link Building Campaign with Risk Awareness
Link building is a volatile part of SEO. A single bad link profile can trigger a manual penalty or algorithmic devaluation. Your agency’s approach should be transparent, scalable, and risk-aware. Use this checklist when briefing them:
- Backlink profile audit first: They must analyze your current backlink profile using tools like Ahrefs or Majestic. Look for spammy domains, irrelevant anchor text, and toxic links. The agency should recommend disavowing or removing harmful links before starting new outreach.
- Define link quality metrics: Ask them to specify target domains based on Domain Authority (DA) or Trust Flow (TF), but also relevance. A link from a high-DA pet blog may not be as valuable for a B2B SaaS site as a more relevant source. They should also check for editorial links, not just directory submissions or paid placements.
- Outreach strategy: The agency should outline their process for guest posting, broken link building, or resource page linking. They should avoid link farms, private blog networks (PBNs), or automated link exchanges. If they mention “bulk backlink packages” or “guaranteed links from .edu domains,” run.
- Monitoring & reporting: They should provide monthly reports showing new links gained, lost links, and changes in referral traffic. They should also track anchor text diversity: over-optimized anchor text (e.g., 80% exact-match) is a penalty risk, and a quick self-check is sketched after this list.
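Anchor text distribution takes minutes to audit from a backlink export. A minimal sketch, assuming a CSV with an `anchor` column; the column name and target phrase are placeholders:

```python
import csv
from collections import Counter

TARGET_PHRASE = "best widgets"  # placeholder exact-match keyword

anchors = Counter()
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor"].strip().lower()] += 1

total = sum(anchors.values())
if total == 0:
    raise SystemExit("No rows found in backlinks.csv")

# A natural profile is dominated by branded and naked-URL anchors;
# exact-match shares anywhere near the 80% figure above are a red flag.
print(f"Exact-match share: {anchors[TARGET_PHRASE] / total:.1%} of {total} links")
for anchor, count in anchors.most_common(10):
    print(f"{count:>5}  {anchor}")
```

Run this monthly against the agency’s own reports; the two numbers should agree.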
| Approach | Risk Level | Typical Cost/Effort | Notes |
|---|---|---|---|
| Guest posting on relevant blogs | Low | Time-intensive | Requires quality content and outreach |
| Broken link building | Low | Moderate | Works best for resource-heavy niches |
| Skyscraper technique | Low | High | Requires creating superior content |
| Directory submissions | Medium | Low | Only if directories are curated |
| PBNs (private blog networks) | High | Variable | Risk of manual penalty |
| Paid links (non-disclosed) | High | High | Violates Google’s guidelines |
Step 6: Demand Transparent Reporting & Ongoing Maintenance
SEO is not a set-it-and-forget-it service. Your agency should provide regular reports that go beyond vanity metrics (like “organic sessions up 20%”). They need to show you the health of your technical foundation. Here’s what to include in your reporting brief:
- Weekly crawl issue summaries: From Google Search Console’s Page indexing and Crawl stats reports, highlighting 404 errors, soft 404s, and server errors. The agency should address these promptly.
- Indexation status: How many pages are indexed versus submitted? If the gap widens, there’s a crawl or quality issue (a quick way to track this yourself is sketched after this list).
- Core Web Vitals trend: A line chart showing LCP, CLS, and INP over time. If scores degrade after a site update, the agency should alert you.
- Backlink profile changes: New links, lost links, and any toxic link alerts. They should also report on anchor text distribution.
- Keyword rankings: For your target terms, with movement tracking. But don’t obsess over daily fluctuations—focus on monthly trends.
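You can reproduce the indexation-gap number yourself each month. A minimal sketch that counts URLs in a standard sitemap; the sitemap URL is a placeholder, and the indexed count is copied by hand from GSC’s Page indexing report:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
INDEXED_COUNT = 8200  # copied manually from GSC's Page indexing report

resp = requests.get(SITEMAP_URL, timeout=10)
resp.raise_for_status()

# Assumes a plain <urlset> sitemap, not a sitemap index file.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
submitted = len(ET.fromstring(resp.content).findall("sm:url", ns))

gap = submitted - INDEXED_COUNT
print(f"Submitted: {submitted}, indexed: {INDEXED_COUNT}, gap: {gap}")
if submitted and (gap / submitted) > 0.2:
    print("Gap above 20% -- ask the agency for a crawl or quality diagnosis.")
```

The 20% threshold here is an arbitrary tripwire for illustration; pick whatever tolerance fits your site’s size and update cadence.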
Final Checklist: What to Demand from Your SEO Agency
Before you sign off on any campaign, use this condensed checklist:
- Technical audit covering robots.txt, XML sitemap, and crawl budget
- Duplicate content analysis with canonical tag recommendations
- Core Web Vitals baseline scores and optimization plan
- On-page optimization with intent mapping and content gap analysis
- Link building strategy with risk assessment and backlink profile audit
- Transparent reporting with technical health metrics and trend data
- No guarantees of “page 1 rankings” or “instant results”
- No use of black-hat techniques like PBNs or paid links
