How to Vet an SEO Agency: A Technical Checklist for Site Performance & Growth

You’ve decided to invest in SEO services. Maybe your organic traffic has flatlined, your Core Web Vitals scores are flashing red, or you’ve just launched a new platform on Google Cloud and need to ensure search engines can actually find your pages. The problem isn’t finding an agency—it’s picking the right one. Every SEO firm promises “performance growth,” but the difference between a partner who delivers sustainable results and one who leaves you with a penalty lies in how they handle technical foundations. This checklist walks you through what to look for, what to avoid, and how to brief an agency so you get audits, on-page optimization, and link building that actually moves the needle.

1. Start with the Technical SEO Audit: What a Real One Looks Like

A credible agency will begin with a technical SEO audit—not a cursory scan, but a deep dive into crawlability, indexation, and site health. You need to understand what this audit covers before you sign anything.

What to expect in a thorough technical audit:

  • Crawl budget analysis: For large sites or those on platforms like Google Cloud, the audit should examine how search bots allocate resources. Are you wasting crawl budget on thin pages, duplicate content, or infinite parameter URLs? The agency should identify which pages matter most and ensure they’re prioritized.
  • robots.txt and XML sitemap review: The robots.txt file must not block critical resources (CSS, JS, images). The XML sitemap should list only canonical, indexable URLs. A good auditor checks for accidental blocking of important sections.
  • Canonical tag health: Duplicate content issues often stem from misconfigured canonical tags. The audit should flag pages where the rel=canonical points to the wrong URL or is missing entirely.
  • Core Web Vitals assessment: LCP, CLS, and INP (which replaced FID as a Core Web Vital in March 2024) are ranking signals. The audit should measure real-user field data (from the Chrome UX Report, CrUX) and lab data (from Lighthouse), then pinpoint causes like oversized images, render-blocking scripts, or slow server response times.
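Part of the robots.txt check described above can be automated with Python’s standard library. A minimal sketch, using a hypothetical robots.txt and an illustrative list of critical resource paths (a real audit would fetch the live file and build the path list from a crawl):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real audit fetches the live file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /assets/js/
"""

# Paths a crawler must reach to render pages; blocking them is a red flag.
CRITICAL_PATHS = ["/assets/js/app.js", "/assets/css/main.css", "/products/"]

def blocked_critical_paths(robots_txt: str, paths: list) -> list:
    """Return the critical paths that robots.txt blocks for all user agents."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [p for p in paths if not parser.can_fetch("*", p)]

print(blocked_critical_paths(ROBOTS_TXT, CRITICAL_PATHS))
# The JS bundle is caught by the "Disallow: /assets/js/" rule
```

Any path returned here is a rendering resource Googlebot cannot fetch, which is exactly the “accidental blocking” a good auditor flags.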
Red flags to watch for:
  • The agency offers a “free audit” that returns only a list of keywords you already rank for. That’s not a technical audit.
  • They claim they can “fix everything in a week.” Technical SEO is iterative; server-side changes, redirect mapping, and content deduplication take time.
  • They don’t mention crawl budget or canonicalization. These are foundational for any site with more than a few hundred pages.
How to brief the audit: Provide the agency with access to Google Search Console, Google Analytics, and your server logs. Ask for a prioritized action plan—not just a list of issues, but which fixes will have the biggest impact on crawl efficiency and indexation.

2. On-Page Optimization: Beyond Meta Tags

On-page optimization is where many agencies fall into the trap of “ticking boxes.” You need a partner who understands intent mapping and content strategy, not just keyword stuffing.

What a real on-page strategy includes:

  • Keyword research with intent mapping: The agency should categorize target terms by user intent—informational, navigational, commercial, transactional. A page optimized for “best cloud network tools” (commercial) requires a different structure and call-to-action than one targeting “how to configure Google Cloud network” (informational).
  • Content strategy that avoids duplication: Duplicate content isn’t just about identical text; it includes near-duplicate product descriptions, thin blog posts, and paginated category pages. The agency should recommend canonicalization, consolidation, or unique content creation.
  • Core Web Vitals integration: On-page changes (image compression, lazy loading, font optimization) must align with performance goals. If the agency suggests adding more widgets or heavy scripts without measuring impact on LCP, push back.
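Intent mapping usually starts as a rule-based first pass before manual SERP review. A sketch with hypothetical modifier lists; real categorization needs human judgment and SERP analysis on top of this:

```python
# Hypothetical rule-based intent bucketing for a keyword list.
# Order matters: transactional modifiers are checked before commercial ones.
INTENT_MODIFIERS = {
    "transactional": ("buy", "pricing", "discount", "order"),
    "commercial": ("best", "top", "review", "vs", "comparison"),
    "informational": ("how to", "what is", "guide", "tutorial"),
}

def map_intent(keyword: str) -> str:
    """Assign a coarse intent bucket based on modifier words."""
    kw = keyword.lower()
    for intent, modifiers in INTENT_MODIFIERS.items():
        if any(m in kw for m in modifiers):
            return intent
    return "navigational"  # fallback: likely brand/navigation queries

print(map_intent("best cloud network tools"))
print(map_intent("how to configure Google Cloud network"))
```

The two example queries land in the commercial and informational buckets respectively, matching the distinction drawn above.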
Table: On-Page vs. Off-Page Focus Areas

| Focus Area | On-Page Optimization | Off-Page Link Building |
|---|---|---|
| Primary goal | Improve relevance & user experience | Increase authority & trust |
| Key metrics | Bounce rate, time on page, Core Web Vitals | Domain Authority, Trust Flow, referral traffic |
| Common risks | Keyword cannibalization, thin content | Black-hat links, unnatural link profiles |
| Typical timeline | 4–8 weeks for measurable impact | 3–6 months for sustainable growth |

How to brief on-page work: Give the agency access to your content management system and a list of priority pages (e.g., high-traffic but low-converting, or new pages with zero organic visibility). Ask for a sample optimization of one page before committing to a full rollout.

3. Link Building: The Riskiest Part of SEO

Link building is where agencies either build sustainable authority or leave you with a penalty. The difference comes down to methodology.

What safe link building looks like:

  • Backlink profile analysis first: The agency should audit your existing backlinks for toxic or spammy links. If you have a history of low-quality links, they’ll need a disavow strategy before building new ones.
  • Outreach based on relevance: Links from sites that are topically related to your niche (e.g., cloud computing, network management) carry more weight than generic directories or press releases. The agency should target editorial links, guest posts on reputable industry blogs, and resource page inclusions.
  • Transparency on methods: They should explain their outreach process—how they identify prospects, what pitch they use, and how they track responses. Avoid agencies that refuse to share their link sources or use automated tools for mass outreach.
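Backlink triage of the kind described above can be sketched as a simple filter. The domains, DA values, and thresholds here are hypothetical; real metrics come from third-party tools like Ahrefs or Majestic:

```python
# Hypothetical spam signals; tune thresholds to your own risk tolerance.
SPAM_TLDS = (".xyz", ".top", ".loan")
MIN_DA = 15

backlinks = [
    {"domain": "cloudblog.example.com", "da": 62, "anchor": "network monitoring guide"},
    {"domain": "cheap-links.xyz", "da": 8, "anchor": "best seo best seo"},
]

def needs_review(link: dict) -> bool:
    """Flag a backlink for manual disavow review based on coarse spam signals."""
    spam_tld = any(link["domain"].endswith(tld) for tld in SPAM_TLDS)
    low_authority = link["da"] < MIN_DA
    return spam_tld or low_authority

flagged = [b["domain"] for b in backlinks if needs_review(b)]
print(flagged)
```

A script like this only shortlists candidates; the disavow decision itself should always be a human call, since disavowing good links can hurt.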
What black-hat looks like (and why to avoid it):
  • Private blog networks (PBNs) designed solely for link passing.
  • Paid links in footer widgets or spammy comments.
  • Automated directory submissions that violate Google’s guidelines.
Risk awareness: A poor link-building campaign can increase the risk of a manual penalty. Recovery, if a manual action is issued, often involves a formal reconsideration request. If an agency promises a large volume of high-authority links in a short time frame, it’s worth questioning their methods.

How to brief a link building campaign: Define your target audience—not just “tech sites,” but specific categories like cloud computing blogs, network security forums, or industry publications. Ask for a monthly report that includes the domain authority (DA) and Trust Flow (TF) of each acquired link, along with the outreach method used.

4. Crawl Budget and Site Architecture: The Hidden Lever

For large sites, crawl budget becomes a critical factor. Search engines allocate a finite number of requests per crawl session. If your site has thousands of low-value URLs (filtered product pages, session IDs, archived posts), bots may never reach your important pages.

What an agency should check:

  • Server log analysis: This reveals which pages Googlebot actually visits, how often, and which URLs return errors (4xx, 5xx). An agency that relies solely on Search Console data misses half the picture.
  • Internal linking structure: Deep pages with no internal links are often ignored by crawlers. The audit should recommend a logical hierarchy—important pages within 3 clicks of the homepage.
  • URL parameter handling: If your site uses filters or tracking parameters, the agency should consolidate parameter URLs with rel=canonical tags pointing to the clean version, or disallow low-value parameter patterns in robots.txt. Google Search Console’s URL Parameters tool was retired in 2022, and rel=nofollow on internal links is not a reliable crawl control.
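The server log analysis mentioned above can be scripted. A minimal sketch assuming a common Apache-style log format, with hypothetical sample lines; a real audit would stream gigabytes of logs and also verify Googlebot IPs via reverse DNS:

```python
import re
from collections import Counter

# Hypothetical sample lines in common Apache log format.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2024:10:00:00 +0000] "GET /products/ HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/May/2024:10:00:05 +0000] "GET /old-page HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/May/2024:10:00:09 +0000] "GET /products/?sort=price HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]

LOG_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP[^"]*" (\d{3})')

def googlebot_crawl_stats(lines):
    """Count Googlebot hits per URL and surface 4xx/5xx responses."""
    hits, errors = Counter(), Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # ignore other user agents
        m = LOG_RE.search(line)
        if not m:
            continue
        url, status = m.group(1), int(m.group(2))
        hits[url] += 1
        if status >= 400:
            errors[url] += 1
    return hits, errors

hits, errors = googlebot_crawl_stats(LOG_LINES)
print(dict(errors))
```

Even this toy run surfaces the two findings the audit should report: a crawled 404 and crawl budget spent on a parameter variant of an existing page.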
Common mistakes:
  • Blocking JavaScript or CSS in robots.txt, which prevents Google from rendering the page.
  • Using redirect chains (e.g., Page A → Page B → Page C), wasting crawl budget and diluting link equity.
  • Ignoring pagination—category pages with hundreds of products need crawlable, self-canonicalizing paginated URLs (Google no longer uses rel=next/prev as an indexing signal), and infinite scroll needs paginated URL fallbacks.
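Redirect chains like the Page A → Page B → Page C example can be detected mechanically from a redirect map. A sketch over a hypothetical map; a real audit builds this map from crawl data or server configuration:

```python
# Hypothetical redirect map: source URL -> target URL.
REDIRECTS = {"/a": "/b", "/b": "/c", "/c": "/final"}

def redirect_chain(url: str, redirects: dict, limit: int = 10) -> list:
    """Follow redirects from url and return the full hop sequence.

    The limit guards against redirect loops.
    """
    chain = [url]
    while chain[-1] in redirects and len(chain) <= limit:
        chain.append(redirects[chain[-1]])
    return chain

print(redirect_chain("/a", REDIRECTS))
```

Any chain longer than two entries (source plus one hop) is a candidate for flattening: point the source directly at the final destination.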
How to brief this: Provide the agency with a sitemap of your top 200 pages and ask them to map the crawl path. Request a recommendation for reducing crawl waste by at least 20% within the first month.
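Mapping the crawl path is essentially a breadth-first traversal of the internal-link graph. A sketch with a hypothetical link graph, computing click depth from the homepage so that pages deeper than three clicks (or orphaned entirely) can be flagged:

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
# A real audit builds this from crawl data.
LINKS = {
    "/": ["/products/", "/blog/"],
    "/products/": ["/products/widget-a"],
    "/blog/": ["/blog/post-1"],
    "/blog/post-1": ["/products/deep-page"],
    "/products/widget-a": [],
    "/products/deep-page": [],
}

def click_depths(links: dict, start: str = "/") -> dict:
    """BFS from the homepage; pages absent from the result are orphans."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths(LINKS)
print([page for page, d in depths.items() if d > 3])  # pages deeper than 3 clicks
```

Here /products/deep-page sits exactly three clicks from the homepage; anything deeper, or any known URL missing from the result, is a candidate for better internal linking.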

5. Core Web Vitals and Site Performance: A Non-Negotiable

Core Web Vitals are among the signals Google uses in its ranking systems, and many agencies treat them as an afterthought. Your agency should prioritize performance from day one.

What to demand:

  • Baseline measurement: Before any changes, the agency should record LCP, CLS, and INP for your top 20 pages using both lab (Lighthouse) and field (CrUX) data.
  • Actionable fixes: They should identify specific causes—for example, LCP delays due to a hero image not being preloaded, or CLS issues from ads without reserved space.
  • Ongoing monitoring: Performance degrades over time as you add content, plugins, or third-party scripts. The agency should set up automated alerts for Core Web Vitals regressions.
Table: Common Core Web Vitals Issues and Fixes

| Metric | Common Cause | Fix |
|---|---|---|
| LCP > 2.5s | Large hero images, slow server response | Compress images, use CDN, preload LCP element |
| CLS > 0.1 | Ads without dimensions, web fonts loading late | Reserve space for ads, use font-display: swap |
| INP > 200ms | Heavy JavaScript, third-party scripts | Defer non-critical JS, lazy load below-the-fold content |
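The “good” thresholds in the table can be turned into a simple pass/fail check for a reporting pipeline. A sketch with hypothetical field data; real values would come from the CrUX API:

```python
# Google's "good" thresholds for the three Core Web Vitals.
THRESHOLDS = {"lcp_ms": 2500, "cls": 0.1, "inp_ms": 200}

def passes_cwv(metrics: dict) -> bool:
    """True only if every metric is at or below its 'good' threshold."""
    return all(metrics[k] <= limit for k, limit in THRESHOLDS.items())

# Hypothetical field data for one page: LCP and INP pass, CLS fails.
page = {"lcp_ms": 2100, "cls": 0.24, "inp_ms": 180}
print(passes_cwv(page))  # False: CLS 0.24 exceeds the 0.1 threshold
```

Note that Google assesses pass/fail at the 75th percentile of real-user data, so a production version of this check should be fed p75 values, not averages.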

Risk awareness: Ignoring Core Web Vitals costs more than rankings: slow sites tend to have higher bounce rates and lower conversion rates. If your agency doesn’t have a performance specialist on staff, treat that as a gap.

How to brief: Ask for a performance budget—a document that specifies maximum page weight, number of requests, and LCP target for each page type. Request monthly performance reports that compare against your baseline.

6. Reporting and Communication: What to Expect

A good agency doesn’t just do the work; they explain it in terms you can act on. Your reporting cadence should include:

  • Monthly technical health scores: Crawl errors, index coverage, Core Web Vitals pass rate.
  • Traffic and conversion attribution: Not just “organic traffic up 20%,” but which pages drove the increase and what user intent they served.
  • Link building progress: Number of new links, average DA/TF, and any disavowed links.
  • Action items for you: Some fixes require your team (e.g., content updates, server configuration). The report should clearly separate what the agency handles vs. what you need to do.
Red flags:
  • Reports that only show vanity metrics (total backlinks, keyword rankings for generic terms).
  • Agencies that refuse to share raw data from Search Console or analytics.
  • Overpromising timelines without explaining the competitive landscape.

Final Checklist for Your Agency Briefing

When you meet with potential SEO partners, use this checklist to evaluate their approach:

  • Do they start with a technical audit that covers crawl budget, Core Web Vitals, and duplicate content?
  • Do they explain how crawl budget works for your platform?
  • Do they provide a prioritized fix list with estimated impact?
  • Do they avoid promising guaranteed rankings or instant results?
  • Do they have a documented link building methodology that excludes black-hat tactics?
  • Do they offer ongoing Core Web Vitals monitoring?
  • Do they provide transparent reporting with raw data access?
A credible agency will welcome these questions. If they deflect or rely on jargon without substance, keep looking. Your site’s long-term health depends on getting the technical foundations right—and that starts with choosing a partner who treats SEO as a systematic process, not a quick fix.

For a deeper dive, explore our guides on technical SEO audits and Core Web Vitals optimization.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
