How to Vet an SEO Agency for Technical Site Health and Core Web Vitals

You’ve decided to hire an SEO agency. Maybe your organic traffic has plateaued, or your Core Web Vitals scores just dropped in Search Console, or you’re simply tired of guessing why pages aren’t ranking. But here’s the hard truth: not every agency that claims to do “technical SEO” actually understands how Googlebot interacts with your server configuration, your JavaScript rendering pipeline, or your crawl budget allocation. The wrong partner can leave you with issues like broken redirects, bloated pages, or a problematic backlink profile.

This checklist walks you through the specific questions, red flags, and deliverables you need to evaluate before signing a contract. We’ll cover technical audits, Core Web Vitals remediation, content strategy alignment, and link building risk management—all with an eye toward what can go wrong if shortcuts are taken.

Step 1: Demand a Transparent Technical SEO Audit, Not a Surface-Level Scan

A proper technical SEO audit goes far beyond running a crawler and producing a PDF with 47 “errors.” You need an agency that can explain why a particular issue matters for your specific site architecture, traffic patterns, and business goals.

What a thorough audit should include:

  • Crawl budget analysis: how many pages Googlebot actually fetches per day, which pages are being ignored, and why.
  • Indexation coverage: pages indexed vs. pages submitted in your XML sitemap, with a breakdown of excluded URLs and the reasons (noindex, canonicalized, blocked by robots.txt, 404, soft 404).
  • JavaScript rendering assessment: whether Google can see and index content that loads dynamically, and whether your server returns meaningful HTML before JavaScript executes.
  • Duplicate content identification: not just exact duplicates, but near-duplicates caused by URL parameters, session IDs, or pagination.
  • Core Web Vitals baseline: LCP, CLS, and INP (which replaced FID as a Core Web Vital in March 2024) for your top landing pages, segmented by device and connection type.
Risk to watch for: Some agencies may apply canonical tags without fully analyzing which version users and search engines actually prefer. This can lead to indexing issues. Insist on seeing the logic behind each canonicalization decision.
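To make "crawl budget analysis" concrete, here is a minimal sketch of the first step: counting which URLs Googlebot actually fetches, straight from your server access logs. This assumes the Apache/Nginx combined log format; the sample lines are illustrative, and a real audit would also verify the requests come from Google's published IP ranges rather than trusting the user-agent string alone.

```python
import re
from collections import Counter

# Combined log format assumed: IP, identity, user, [date], "request",
# status, bytes, "referrer", "user agent".
LOG_PATTERN = re.compile(
    r'\S+ \S+ \S+ \[(?P<date>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_fetches(log_lines):
    """Return a Counter of URL paths fetched by Googlebot."""
    counts = Counter()
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("path")] += 1
    return counts

# Illustrative log lines, not real traffic.
sample = [
    '66.249.66.1 - - [10/May/2024:06:25:24 +0000] "GET /products/widget HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/May/2024:06:25:31 +0000] "GET /blog/post-1 HTTP/1.1" 200 8841 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/May/2024:06:26:02 +0000] "GET /products/widget HTTP/1.1" 200 5123 "-" "Mozilla/5.0"',
]
print(googlebot_fetches(sample))  # only the two Googlebot lines are counted
```

An agency doing this for real would aggregate by day and by site section, then compare the crawled set against the sitemap to find pages Googlebot is ignoring.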

Table: Audit Deliverables Comparison

Deliverable | Basic Agency | Advanced Agency
Crawl report | List of 404s and redirects | Full coverage analysis with crawl budget breakdown
Core Web Vitals | Lab data from PageSpeed Insights | Field data from CrUX report, segmented by URL and device
Duplicate content | Count of duplicate titles | Canonicalization map with rationale for each decision
JavaScript rendering | "Site uses JS" warning | Waterfall chart of JS execution and its impact on LCP
Action plan | Generic "fix errors" list | Prioritized tasks by estimated traffic impact and effort

Step 2: Validate Their Approach to Core Web Vitals and Site Performance

Core Web Vitals are not a one-time fix. They require ongoing monitoring because changes to your CMS, third-party scripts, or CDN configuration can regress scores overnight. An agency that promises to “pass Core Web Vitals in two weeks” may be oversimplifying the problem or planning to use aggressive tactics like lazy-loading everything, which can hurt user experience in the long run.

What to look for in their methodology:

  • They measure field data (from Chrome User Experience Report) alongside lab data. Field data reflects real user conditions, while lab data is synthetic.
  • They understand the difference between optimizing LCP (largest contentful paint) by reducing server response time vs. compressing images vs. preloading critical resources. Each approach has trade-offs.
  • They can explain how CLS (cumulative layout shift) is caused not just by images without dimensions, but by dynamically injected ads, embeds, or web fonts that load late.
  • They have a process for identifying and deferring or removing render-blocking third-party scripts, with a fallback plan if the script is essential for business operations (e.g., analytics, tracking pixels).
What can go wrong: Aggressive script deferral can break analytics tracking or ad serving. Poorly implemented lazy loading can push LCP even higher, because lazy-loading the hero image means it only starts downloading after the browser has laid out the page, rather than as part of the initial load. An agency that doesn’t test across different devices and connection speeds may “fix” Core Web Vitals for a desktop on fiber while making things worse for a mobile user on 3G.
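The device-segmentation point matters because Core Web Vitals are assessed at the 75th percentile of real-user samples, separately per segment. Here is a sketch of that assessment for LCP; the sample millisecond values are illustrative, not real CrUX data, and the percentile uses a simple nearest-rank method.

```python
LCP_GOOD_MS = 2500  # Google's published "good" threshold for LCP

def percentile_75(values):
    """75th percentile using nearest-rank on a sorted copy."""
    ordered = sorted(values)
    index = max(0, int(round(0.75 * len(ordered))) - 1)
    return ordered[index]

def assess_lcp(samples_by_device):
    """Return pass/fail per device segment at the 75th percentile."""
    return {
        device: ("good" if percentile_75(ms) <= LCP_GOOD_MS else "needs work")
        for device, ms in samples_by_device.items()
    }

# Illustrative field samples (ms): desktop on fast connections passes,
# the mobile segment fails even though the averages look closer.
field_data = {
    "desktop": [900, 1100, 1300, 1800, 2100, 2300, 2400, 2600],
    "mobile": [1800, 2400, 2900, 3100, 3400, 3900, 4200, 4800],
}
print(assess_lcp(field_data))  # {'desktop': 'good', 'mobile': 'needs work'}
```

This is exactly the scenario the step warns about: a site can look healthy in desktop lab runs while failing the mobile field segment that Google actually evaluates.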

Step 3: Scrutinize Their Content Strategy and Keyword Research Process

Technical SEO and content strategy are not separate silos. If your site has great technical health but pages target the wrong keywords or fail to match search intent, you won’t rank. Conversely, if you have excellent content that is buried by poor crawlability or slow load times, no one will see it.

How to evaluate their process:

  • Keyword research should include intent mapping: separating informational, navigational, commercial, and transactional queries. An agency that throws a list of high-volume keywords at you without explaining intent is likely to produce content that doesn’t convert.
  • They should analyze your existing content inventory to identify gaps, overlaps, and opportunities for consolidation (e.g., merging thin pages into a single comprehensive guide to avoid duplicate content issues).
  • Content strategy must align with technical constraints. For example, if your site architecture limits the number of top-level categories, the agency should work within that structure rather than proposing a flat hierarchy that confuses users and crawlers.
Red flag: An agency that promises to “target many keywords per month” without discussing content quality, internal linking, or the resources required to produce and maintain that volume. This often leads to keyword stuffing or thin content that search engines devalue.
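A first pass at intent mapping often works from query modifiers before anyone looks at the SERPs. The sketch below shows the idea; the modifier lists and the check order (transactional first) are illustrative assumptions, and a serious agency would validate each classification against what actually ranks for the query.

```python
# Modifier lists are illustrative, not exhaustive; checked in dict order,
# so transactional signals win over commercial ones when both appear.
INTENT_MODIFIERS = {
    "transactional": ("buy", "price", "discount", "coupon", "order"),
    "commercial": ("best", "review", "vs", "comparison", "top"),
    "informational": ("how", "what", "why", "guide", "tutorial"),
}

def classify_intent(query):
    """Bucket a query by its modifier words; fall through to unclassified."""
    words = query.lower().split()
    for intent, modifiers in INTENT_MODIFIERS.items():
        if any(m in words for m in modifiers):
            return intent
    return "navigational/unclassified"

for q in ["buy running shoes", "best running shoes 2024",
          "how to lace running shoes", "nike official site"]:
    print(q, "->", classify_intent(q))
```

Even this rough bucketing exposes the red flag above: a list of "high-volume keywords" that is mostly informational will never convert the way a commercial or transactional list would.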

Step 4: Insist on a Risk-Aware Link Building Plan

Link building is where many SEO agencies cut corners. Tactics like private blog networks (PBNs), paid links, or automated outreach can produce short-term gains but leave your site vulnerable to manual penalties that take months to recover from.

What a responsible link building plan looks like:

  • Backlink profile analysis first: they should audit your existing links for toxic signals (spammy anchor text, low-trust domains, sites with no editorial oversight) and disavow only the genuinely harmful ones, not every link from a low-DA domain.
  • Outreach targets should be based on topical relevance and editorial merit, not just Domain Authority or Trust Flow. A link from a relevant industry blog with modest authority is often more valuable than a link from a high-DA directory that has no contextual connection to your content.
  • They should provide a clear process for content-based link acquisition: creating genuinely useful resources (guides, data studies, tools) that publishers want to link to naturally.
  • They should be transparent about the ratio of dofollow to nofollow links and explain why a natural profile includes both.
What can go wrong: Over-disavowing links from low-DA but relevant sites can reduce your link profile’s diversity. Building links too quickly from unrelated sites can potentially trigger Google’s spam algorithms. An agency that guarantees a specific number of backlinks per month may be using questionable methods.
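The "disavow only the genuinely harmful ones" principle can be sketched as a triage that requires multiple spam signals before a link is even flagged for review. The signal names and the two-signal threshold below are illustrative assumptions, not an agency's actual scoring model; real audits combine signals like these with manual review before anything is disavowed.

```python
def triage_backlink(link):
    """Return 'review' only when multiple spam signals co-occur, else 'keep'.

    Requiring two or more signals avoids the over-disavowing problem:
    a low-DA but topically relevant link stays in the profile.
    """
    signals = 0
    if link.get("exact_match_anchor"):      # money-keyword anchor text
        signals += 1
    if not link.get("topically_relevant"):  # no contextual connection
        signals += 1
    if link.get("sitewide"):                # footer/sidebar link on every page
        signals += 1
    return "review" if signals >= 2 else "keep"

# Illustrative examples, not real domains.
links = [
    {"domain": "industry-blog.example", "exact_match_anchor": False,
     "topically_relevant": True, "sitewide": False},
    {"domain": "link-farm.example", "exact_match_anchor": True,
     "topically_relevant": False, "sitewide": True},
]
print([(l["domain"], triage_backlink(l)) for l in links])
```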

Table: Link Building Approaches Comparison

Approach | Risk Level | Typical Outcome | Recovery Time If Penalized
Guest posting on relevant sites | Low | Gradual authority gain | N/A (if done ethically)
Broken link building | Low | Moderate link volume | N/A
Content-based outreach (original research) | Low | High-quality links, long-term | N/A
PBN links | Very high | Short-term spike, then penalty | 6–12 months after cleanup
Paid links (undisclosed) | High | Quick boost, then manual action | 3–6 months after disavow
Automated directory submissions | Medium | Low-value links, possible algorithmic penalty | 1–3 months after removal

Step 5: Verify Their On-Page Optimization Methodology

On-page optimization is more than stuffing keywords into title tags and meta descriptions. It involves structuring content so that search engines can understand the hierarchy and relationships between sections, while also delivering a good user experience.

Key areas to evaluate:

  • Title tags and headings: They should be unique, descriptive, and include the target keyword naturally. Avoid keyword stuffing or using the same title across multiple pages.
  • Internal linking: The agency should have a plan for linking related content in a way that distributes link equity and helps users navigate. This includes using descriptive anchor text (not “click here”) and ensuring that important pages are linked from the homepage or main navigation.
  • Image optimization: Alt text should describe the image content and include relevant keywords where natural. File names should be descriptive, and images should be compressed without noticeable quality loss.
  • Schema markup: They should identify opportunities for structured data (e.g., FAQ, HowTo, Article, Product) that can enhance search result appearance, but only where the markup accurately represents the page content.
Risk to watch for: Over-optimization, such as using exact-match keywords in every heading, can make content read unnaturally and may potentially trigger algorithmic penalties. Similarly, adding schema markup for the sake of it (e.g., marking a blog post as a “Product” to get rich results) can lead to manual action.
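Two of the on-page checks above, duplicate titles and missing alt text, are mechanical enough to sketch with nothing but the standard library. A production audit would crawl rendered pages at scale; the two HTML snippets below are illustrative.

```python
from html.parser import HTMLParser
from collections import Counter

class OnPageAuditor(HTMLParser):
    """Collect the <title> text and count <img> tags with no alt attribute."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "img" and "alt" not in dict(attrs):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit_pages(pages):
    """pages: {url: html}. Returns per-page results and duplicated titles."""
    results = {}
    for url, html in pages.items():
        auditor = OnPageAuditor()
        auditor.feed(html)
        results[url] = (auditor.title.strip(), auditor.images_missing_alt)
    title_counts = Counter(title for title, _ in results.values())
    duplicates = [t for t, n in title_counts.items() if n > 1]
    return results, duplicates

# Illustrative pages: same title reused, one image missing alt text.
pages = {
    "/a": "<title>Widgets</title><img src='x.jpg'>",
    "/b": "<title>Widgets</title><img src='y.jpg' alt='Blue widget'>",
}
results, dupes = audit_pages(pages)
print(dupes)  # ['Widgets'] flagged as a duplicate title
```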

Step 6: Confirm Their Monitoring and Reporting Cadence

SEO is not a set-it-and-forget-it service. Algorithm updates, competitor actions, and changes to your own site (e.g., a new CMS version, a redesign, or added tracking scripts) can all affect performance. You need an agency that monitors continuously and reports transparently.

What to expect in reporting:

  • Monthly or bi-weekly reports that include organic traffic trends, keyword rankings (by position and volume), Core Web Vitals scores, crawl errors, and backlink profile changes.
  • Alerts for significant drops in traffic, rankings, or Core Web Vitals scores, with a preliminary diagnosis of the likely cause.
  • Actionable insights, not just data dumps. For example, instead of saying “traffic decreased 10%,” they should explain that the drop was concentrated in three blog posts that lost rankings after a Google update, and that they plan to update those posts with fresh content and improved internal linking.
Red flag: An agency that only reports on vanity metrics like “total backlinks” or “keyword volume” without linking them to business outcomes (leads, sales, or conversions). Also be wary of agencies that ask for a long-term contract (e.g., 12 months) with no performance-based exit clause.
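The "alerts for significant drops" expectation can be as simple as comparing a recent window of daily organic sessions against the prior window. The 15% threshold, the one-week window, and the session numbers below are illustrative assumptions; the point is that an agency should be able to show you logic like this, not just a dashboard.

```python
def detect_drop(daily_sessions, window=7, threshold=0.15):
    """Return (dropped?, percent change) comparing the last `window` days
    against the `window` days immediately before them."""
    recent = daily_sessions[-window:]
    prior = daily_sessions[-2 * window:-window]
    recent_avg = sum(recent) / len(recent)
    prior_avg = sum(prior) / len(prior)
    change = (recent_avg - prior_avg) / prior_avg
    return change <= -threshold, round(change * 100, 1)

# Two illustrative weeks of daily organic sessions; week two falls off.
sessions = [1000, 980, 1020, 990, 1010, 1005, 995,
            800, 790, 810, 805, 795, 788, 812]
print(detect_drop(sessions))  # (True, -20.0): a 20% drop trips the alert
```

A windowed comparison like this smooths out day-of-week noise that a naive day-over-day check would constantly alert on; the follow-up diagnosis (which pages, which queries) is where the agency earns its fee.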

Step 7: Have a Clear Offboarding Plan

You may not need it now, but you should know how the relationship ends. A good agency will ensure that you own all the work they’ve done: the audit reports, the content calendar, the backlink outreach templates, the technical fixes implemented. They should also provide documentation on how to maintain the improvements after they leave.

Questions to ask:

  • Who owns the data (Search Console, Analytics, etc.)? It should be your account, not theirs.
  • What happens to the content they produce? You should have full rights to republish or modify it.
  • Will they provide a transition document outlining current performance, pending tasks, and ongoing maintenance activities?

Summary: Your Pre-Engagement Checklist

Before signing any contract, run through this checklist with the agency:

  • They provide a sample technical audit report that includes crawl budget analysis, JavaScript rendering assessment, and Core Web Vitals field data.
  • They explain how they handle duplicate content without relying solely on canonical tags.
  • They have a documented process for Core Web Vitals optimization that includes both lab and field data, and they acknowledge the risks of aggressive script deferral.
  • Their keyword research includes intent mapping and aligns with your existing content inventory.
  • Their link building plan avoids black-hat tactics and includes a backlink profile audit first.
  • They provide transparent reporting with actionable insights, not just vanity metrics.
  • They agree that you own all data and deliverables, and they provide a transition plan for offboarding.
Choosing the right technical SEO partner is an investment in your site’s long-term health. The agency that asks the most questions about your business, your users, and your infrastructure is likely the one that will deliver sustainable results.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
