How to Vet a Technical SEO Agency: A Practical Checklist for Site Health & Performance
You’ve just received the crawl report from your SEO agency, and it’s 47 pages long—full of red flags about duplicate content, slow Core Web Vitals, and a crawl budget that seems to be evaporating. Your first instinct might be to panic, but here’s the reality: a technical SEO audit is only as valuable as the agency’s ability to prioritize and execute fixes. The difference between a site that climbs rankings and one that stalls often comes down to how well you brief the agency on what matters—and how well you hold them accountable. This guide walks you through the essential steps to evaluate an SEO agency’s technical services, from understanding crawl mechanics to avoiding common traps like black-hat link building.
1. Start with the Technical SEO Audit: What You Should Expect
A thorough technical SEO audit is the foundation of any site health campaign. A competent agency will not just run a tool like Screaming Frog or Sitebulb; they will interpret the data within the context of your site’s architecture and business goals. The audit should cover, at minimum, crawlability, indexation, site speed, mobile usability, and structured data.
When you receive the audit, look for these specific deliverables (a spot-check sketch follows the list):
- Crawl budget analysis: Is the agency checking how Googlebot allocates resources across your site? For large e-commerce sites with thousands of product pages, inefficient crawl budget allocation can leave important pages unindexed.
- Core Web Vitals breakdown: Are LCP, CLS, and INP (which replaced FID in March 2024) being measured against real-user data from the Chrome User Experience Report (CrUX), not just lab simulations?
- XML sitemap and robots.txt review: Are there orphan pages? Is the sitemap bloated with low-value URLs? Is the robots.txt accidentally blocking critical sections?
- Canonical tag audit: Are canonical tags pointing to the right versions of pages, or are they creating a web of conflicting signals?
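A few of these deliverables are cheap to sanity-check yourself before (or after) the agency reports. Below is a minimal sketch using only Python's standard library that checks whether key URLs are crawlable under robots.txt and what their canonical tags point to; the domain and page URLs are placeholders for your own site.

```python
# A minimal spot-check sketch (not a full audit): verifies that key URLs
# are crawlable per robots.txt and that each page's canonical tag points
# where you expect. The domain and URLs below are placeholders.
from urllib import robotparser, request
from html.parser import HTMLParser

SITE = "https://www.example.com"           # placeholder domain
PAGES = [f"{SITE}/products/widget-a",      # hypothetical key pages
         f"{SITE}/blog/buying-guide"]

class CanonicalParser(HTMLParser):
    """Grab the href of <link rel="canonical"> if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

rp = robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()

for url in PAGES:
    allowed = rp.can_fetch("Googlebot", url)
    html = request.urlopen(url).read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(html)
    print(f"{url}\n  crawlable: {allowed}  canonical: {parser.canonical}")
```

Running this against a handful of revenue-critical URLs is a quick way to verify an agency's robots.txt and canonical findings against reality.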
2. On-Page Optimization: Beyond Keyword Stuffing
On-page optimization has evolved far beyond stuffing target keywords into H1 tags and meta descriptions. A modern agency should be mapping content to search intent—what the user actually wants when they type a query. This is where keyword research meets intent mapping.
For example, a query like “best running shoes for flat feet” indicates commercial investigation, while “how to choose running shoes” is informational. The agency’s content strategy should tailor page structure, internal linking, and multimedia (videos, infographics) to match that intent.
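To make intent mapping concrete, here is a toy, rule-based sketch of how queries can be bucketed by their modifier words. Real intent mapping also weighs SERP features and competitor pages, and the modifier lists below are illustrative assumptions, not a definitive taxonomy.

```python
# A toy illustration of rule-based intent mapping: bucket queries by the
# modifier words they contain. Naive substring matching has pitfalls
# (e.g., "top" matches "laptop"); a real pipeline would tokenize.
INTENT_MODIFIERS = {
    "transactional": ["buy", "price", "coupon", "discount"],
    "commercial": ["best", "top", "review", "vs", "compare"],
    "informational": ["how to", "what is", "guide", "tutorial", "why"],
}

def classify_intent(query: str) -> str:
    q = query.lower()
    for intent, modifiers in INTENT_MODIFIERS.items():
        if any(m in q for m in modifiers):
            return intent
    return "navigational/unclear"

print(classify_intent("best running shoes for flat feet"))  # commercial
print(classify_intent("how to choose running shoes"))       # informational
```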
Here’s a table comparing old-school on-page tactics with modern best practices:

| Old Approach | Modern Approach |
|---|---|
| Exact-match keyword in title tag | Topic clusters with semantic variations |
| Meta keyword tag (deprecated) | Structured data (schema markup) for rich snippets |
| High keyword density (3-5%) | Natural language with related entities |
| Separate mobile/desktop pages | Responsive design with mobile-first indexing |
| Single focus keyword per page | Primary + secondary keywords aligned to user journey |
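To illustrate the structured data row above, here is a minimal sketch that emits Product schema as JSON-LD. All field values are placeholders; validate any real markup with Google's Rich Results Test before shipping it.

```python
# A minimal sketch of emitting Product schema markup (JSON-LD). Field
# values are illustrative, not real product data.
import json

product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Runner X",                  # placeholder product
    "description": "Stability running shoe for flat feet.",
    "offers": {
        "@type": "Offer",
        "price": "129.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed in the page head as a JSON-LD script tag.
print(f'<script type="application/ld+json">{json.dumps(product_schema)}</script>')
```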
When briefing an agency, ask for a sample content brief that includes intent mapping, competitor gap analysis, and internal linking recommendations. If they can’t explain why a certain keyword should target a blog post versus a product page, they’re probably still operating in 2015.
3. Performance Optimization: Core Web Vitals and Site Speed
Core Web Vitals are one ranking signal among many, but many agencies treat them as a checkbox exercise. A good agency will dig into the underlying causes: oversized images, render-blocking JavaScript, inefficient CSS, or slow server response times. The goal is not just to pass a Google PageSpeed Insights test but to deliver a fast, stable user experience across devices and network conditions.
What to look for in an agency’s performance report (a CrUX query sketch follows the list):
- LCP (Largest Contentful Paint): Is it at or below 2.5 seconds at the 75th percentile? Is the hero image preloaded rather than lazy-loaded? (Lazy-loading the LCP element only delays it.)
- CLS (Cumulative Layout Shift): Are font swaps or dynamic ads causing layout shifts?
- INP (Interaction to Next Paint, which replaced FID as the responsiveness Core Web Vital in March 2024): Is long-running JavaScript blocking user interactions?
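You can verify an agency's field numbers against CrUX yourself. The sketch below queries the Chrome UX Report API for an origin's 75th-percentile metrics; the API key and origin are placeholders, and you'll need your own key from the Google Cloud console.

```python
# A sketch of pulling field (real-user) Core Web Vitals from the Chrome
# UX Report (CrUX) API. CRUX_API_KEY and the origin are placeholders.
import requests

CRUX_API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = ("https://chromeuxreport.googleapis.com/v1/"
            f"records:queryRecord?key={CRUX_API_KEY}")

resp = requests.post(ENDPOINT, json={
    "origin": "https://www.example.com",   # placeholder origin
    "formFactor": "PHONE",
    "metrics": ["largest_contentful_paint",
                "cumulative_layout_shift",
                "interaction_to_next_paint"],
})
resp.raise_for_status()

for name, data in resp.json()["record"]["metrics"].items():
    # p75 is the value Google uses to assess each Core Web Vital.
    print(f"{name}: p75 = {data['percentiles']['p75']}")
```

If the agency's report and the CrUX p75 numbers diverge significantly, ask whether they are quoting lab data from a single test run.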
4. Link Building: The Risky Business of Backlink Profiles
Link building remains one of the most effective off-page SEO tactics, but it’s also the most dangerous. Black-hat techniques—like buying links from private blog networks (PBNs), using automated link exchanges, or spamming forum comments—can increase the risk of manual penalties from Google. A reputable agency will focus on earning links through content outreach, digital PR, and relationship building.
When evaluating an agency’s link building approach, ask for:
- A sample outreach email: Is it personalized or a template blast?
- Their disavow process: Do they regularly audit the backlink profile for toxic links?
- Metrics they track: Are they fixated on Domain Authority (DA) or Trust Flow (TF) alone, or do they consider relevance and traffic potential?

| Approach | Risk Level | Typical Outcome |
|---|---|---|
| Guest posting on authority sites | Low | Steady DA growth, referral traffic |
| Broken link building | Low | High relevance, time-intensive |
| PBN links | High | Short-term ranking spikes, possible penalty risk |
| Sponsored content (no disclosure) | Medium | Possible Google action if detected |
| HARO/Connectively outreach | Low | High-quality editorial links |
A good agency will also explain that some low-quality links are inevitable—spam bots or scrapers may link to your site without your knowledge. The key is the ratio of high-quality to low-quality links and the agency’s ability to identify and disavow the toxic ones.
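The mechanical end of that disavow process is simple; the hard part is deciding which domains are genuinely toxic. As a sketch, assuming your audit has already produced a flagged-domain list (the list below is hypothetical), this emits a file in the format Google's disavow tool accepts:

```python
# A minimal sketch of the disavow step: given domains flagged as toxic
# (however your audit identifies them), write a file in the format the
# Google disavow tool accepts: "#" comments and "domain:" lines.
from datetime import date

flagged_domains = ["spammy-pbn-1.example", "scraper-farm.example"]  # hypothetical

lines = [f"# Disavow file generated {date.today().isoformat()}",
         "# Domains flagged as toxic during the quarterly backlink audit"]
lines += [f"domain:{d}" for d in flagged_domains]

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Ask the agency to walk you through how a domain gets onto that flagged list; the judgment behind it matters far more than the file format.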
5. Crawl Budget and Site Architecture: The Hidden Lever
Crawl budget is often overlooked by less experienced agencies. For small sites (under 10,000 pages), it’s rarely an issue. But for large e-commerce platforms, news sites, or content-heavy blogs, inefficient crawl budget allocation can mean that Google spends its time on low-value pages (e.g., filter parameters, pagination archives) while ignoring your most important product or article pages.
An agency should analyze your server logs (not just crawl data) to see how Googlebot actually moves through your site; a minimal log-parsing sketch follows the list below. They should then recommend:
- Consolidating thin pages (e.g., merging similar product variations).
- Using `noindex` tags or canonical tags on paginated URLs.
- Optimizing the XML sitemap to prioritize high-value content.
- Reducing crawl depth—critical pages should be within 3 clicks of the homepage.
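Here is the log-parsing sketch mentioned above: it counts Googlebot requests per top-level path so you can see where crawl budget actually goes. It assumes combined-format access logs, the filename is a placeholder, and it matches on user-agent only; a real audit should verify Googlebot via reverse DNS, since user agents can be spoofed.

```python
# A minimal log-parsing sketch: count Googlebot hits per top-level path
# to see where crawl budget actually goes. Assumes combined-format
# access logs; reverse-DNS verification of Googlebot is omitted.
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+"')
hits = Counter()

with open("access.log") as f:          # placeholder log file
    for line in f:
        if "Googlebot" not in line:    # user-agent match only; spoofable
            continue
        m = LOG_LINE.search(line)
        if m:
            # Bucket by first path segment, e.g. /products/... -> /products
            segment = "/" + m.group("path").lstrip("/").split("/", 1)[0]
            hits[segment] += 1

for segment, count in hits.most_common(10):
    print(f"{segment}: {count} Googlebot requests")
```

If faceted-navigation or parameter paths dominate the top ten, that is direct evidence of the crawl budget waste described above.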
6. The Onboarding Brief: What You Need to Provide
To get the most out of an SEO agency, you need to brief them effectively. A good brief saves time, reduces friction, and ensures the agency focuses on what actually drives business value. Here’s a checklist for your initial briefing:
- Business goals: Are you aiming for leads, e-commerce sales, or brand awareness?
- Target audience: Demographics, pain points, and search behavior.
- Current site issues: Known technical problems, past penalties, or recent migrations.
- Competitor landscape: Who are your top 3 competitors, and what are they doing well?
- KPIs: Organic traffic, keyword rankings, conversion rate, or something else?
- Budget and timeline: Realistic expectations for how long SEO takes (often 3-6 months, but can vary).
7. Risk-Aware Content: What Can Go Wrong
Even with a reputable agency, things can go wrong. Here are common pitfalls and how to avoid them:
- Wrong redirects: Using 302 (temporary) redirects instead of 301 (permanent) can sometimes dilute link equity depending on implementation. An agency should audit redirect chains and fix any loops (see the sketch after this list).
- Poor Core Web Vitals optimization: Over-optimizing for speed can break functionality. Test changes in a staging environment first.
- Black-hat links: If you suspect an agency is using PBNs, ask for a list of all links built and check their domain authority and relevance.
- Duplicate content from syndication: If you republish content from other sources, ensure canonical tags point to the original.
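For the redirect pitfall above, a chain checker is straightforward to sketch. This version follows each hop manually (rather than letting the HTTP client collapse the chain) so every status code is visible, then flags 302s and multi-hop chains; the URL list is a placeholder for your own redirected URLs.

```python
# A sketch of a redirect-chain audit: follow each hop manually so every
# status code is visible, then flag 302s and multi-hop chains.
import requests
from urllib.parse import urljoin

urls_to_check = ["http://www.example.com/old-page"]  # placeholder

for url in urls_to_check:
    chain, current = [], url
    for _ in range(10):  # guard against infinite redirect loops
        r = requests.get(current, allow_redirects=False, timeout=10)
        chain.append((current, r.status_code))
        if r.status_code in (301, 302, 307, 308):
            current = urljoin(current, r.headers["Location"])
        else:
            break
    else:
        print(f"Possible loop (>10 hops) starting at {url}")

    print(" -> ".join(f"{u} [{code}]" for u, code in chain))
    if any(code == 302 for _, code in chain):
        print("  note: 302 in chain; confirm it shouldn't be a 301")
    if len(chain) > 2:  # more than one redirect before the final page
        print("  note: multi-hop chain; consider redirecting directly")
```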
Summary Checklist for Hiring or Evaluating an SEO Agency
- Does the agency provide a prioritized technical audit with crawl budget analysis?
- Are Core Web Vitals measured using real-user data (CrUX), not just lab tests?
- Does their on-page strategy include intent mapping and structured data?
- Is their link building approach transparent and focused on earning links, not buying them?
- Do they analyze server logs for crawl behavior?
- Can they explain the risks of each optimization (e.g., image compression trade-offs)?
- Do they provide a clear brief template for onboarding?
- Are they willing to share examples of past audits (anonymized)?
