The SEO Agency Checklist: How to Vet Technical Audits, On-Page Optimization & Core Web Vitals

You’ve hired an SEO agency—or you’re about to. The pitch deck was slick, the case studies were shiny, and the account manager promised “organic growth.” But here’s the uncomfortable truth: SEO is not a magic wand. It’s a systematic process of diagnosing, fixing, and monitoring technical and content signals that search engines use to rank pages. If your agency isn’t running a proper technical audit, mapping keywords to user intent, or addressing Core Web Vitals, you’re paying for decoration, not results.

This checklist walks you through the essential deliverables you should expect from a top-tier SEO services agency. Use it to brief your team, audit your current provider, or build your own internal process. We’ll cover technical audits, crawl budget management, on-page optimization, Core Web Vitals, and link building—with the risks that come from shortcuts.

1. The Technical SEO Audit: What a Real Site Audit Looks Like

A technical audit is the foundation. Without one, every other optimization sits on cracked ground. A proper audit goes beyond a quick crawl report; it involves analyzing server logs, indexing status, and structural issues that prevent search engines from finding and ranking your content.

What to expect in a thorough technical audit:

  • Crawl analysis: The agency should examine your crawl budget—how Googlebot allocates its time across your site. If you have thousands of thin pages, broken links, or infinite filter parameters, bots waste time there instead of on your money pages.
  • Index coverage: Using tools like Google Search Console, they should identify which pages are indexed, which are excluded, and why. Common culprits: noindex tags, blocked resources, or poor internal linking.
  • Duplicate content and canonical tags: Duplicate content dilutes ranking signals. The agency must verify that every page has a self-referencing canonical tag (or a cross-domain canonical where needed) and that no two pages compete for the same keyword.
  • robots.txt and XML sitemap: These files control what gets crawled and how. A misconfigured robots.txt can block your entire site; an outdated XML sitemap can leave new pages invisible. The agency should test both and ensure they’re aligned with your site architecture.
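You can sanity-check a robots.txt yourself with Python's standard library before (or after) your agency does. A minimal sketch; the rules and URLs below are placeholders, not your real site:

```python
from urllib import robotparser

# Hypothetical robots.txt rules. For a live site, call
# rp.set_url("https://yoursite.com/robots.txt") and rp.read() instead.
rules = """\
User-agent: *
Disallow: /search/
Disallow: /cart/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A money page: should be crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))   # True
# Internal search results: low-value, should be blocked.
print(rp.can_fetch("Googlebot", "https://example.com/search/?q=widget"))  # False
```

Run this against every template on your site; a single misplaced Disallow rule can knock out an entire section.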
Common risks to watch for:
  • Over-optimization: Fixing everything at once can trigger algorithmic penalties. A good agency prioritizes issues by impact.
  • Wrong redirects: Using 302s instead of 301s for permanent moves, or implementing redirect chains, wastes crawl budget and confuses users.
  • Ignoring Core Web Vitals: If the audit doesn’t mention LCP, CLS, or INP, it’s incomplete. Google folds these metrics into its page experience ranking signals.
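The redirect checks above are easy to automate. A rough sketch, using a hypothetical helper that inspects the status codes observed while following a moved URL (a real audit would crawl the URLs live):

```python
def audit_redirects(hops):
    """Flag common redirect mistakes.

    hops: list of HTTP status codes seen while following a moved URL,
    e.g. [301] for a clean move, [302, 301] for a sloppy chain.
    """
    issues = []
    if 302 in hops or 307 in hops:
        issues.append("temporary redirect (302/307) used; use 301/308 for permanent moves")
    if len(hops) > 1:
        issues.append(f"chain of {len(hops)} hops; point the old URL straight at the final one")
    return issues

print(audit_redirects([301]))       # [] -- a clean permanent redirect
print(audit_redirects([302, 301]))  # flags both the 302 and the two-hop chain
```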
Comparison: Types of SEO Audits

Audit Type | Scope | Typical Tools | Output
Quick crawl audit | 1,000–10,000 URLs | Screaming Frog, Sitebulb | List of broken links, missing meta tags, duplicate content
Full technical audit | All pages + server logs + GSC data | DeepCrawl, Botify, Google Search Console | Prioritized issues with impact scoring, crawl budget analysis, Core Web Vitals report
Competitive technical audit | Your site vs. top 5 competitors | Ahrefs, Semrush, Botify | Gap analysis: what competitors do better in site structure, speed, and indexation

Action step: Ask your agency for a sample audit report before signing. If it’s just a list of 404s and missing alt text, they’re not doing deep work.

2. Crawl Budget: Why It Matters and How to Optimize It

Crawl budget is the number of URLs Googlebot will crawl on your site within a given timeframe. For small sites (under 1,000 pages), it’s rarely an issue. For large e-commerce sites, news portals, or SaaS platforms with thousands of product pages, it’s critical.

How a good agency handles crawl budget:

  • Identify waste: They’ll look for paginated archives, session IDs, sort parameters, and other low-value URLs that consume crawl slots.
  • Consolidate thin content: Pages with little to no unique content—like auto-generated tag pages or old blog posts—should be merged, noindexed, or removed.
  • Use robots.txt and sitemaps strategically: Block irrelevant directories (e.g., /search/ or /cart/) while ensuring high-value pages appear in the sitemap.
  • Monitor crawl stats in GSC: A sudden drop in crawl requests can signal a server issue or a penalty. The agency should set up alerts.
What can go wrong:
  • Over-blocking: Blocking too many pages with robots.txt can prevent Google from discovering new content.
  • Under-blocking: Leaving infinite filter combinations crawlable can cause “crawl bloat,” where bots waste resources on near-duplicate pages.
  • Ignoring server response times: If your server takes 3 seconds to respond, Googlebot may reduce crawl rate. Core Web Vitals optimization directly helps here.
Practical tip: If your agency mentions “crawl budget” only in passing, ask them to show you the crawl stats from your server logs. Real data beats assumptions.
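Pulling crawl stats out of raw server logs doesn't require specialist tooling. A sketch assuming the common combined log format; the sample lines are invented, and a real audit should verify Googlebot via reverse DNS since the user-agent string is easily spoofed:

```python
import re
from collections import Counter

# Invented sample lines in Apache/Nginx combined log format.
log_lines = [
    '66.249.66.1 - - [10/May/2025:10:00:01 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/May/2025:10:00:02 +0000] "GET /search/?q=a&sort=price HTTP/1.1" 200 900 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/May/2025:10:00:03 +0000] "GET /search/?q=b HTTP/1.1" 200 900 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

request_re = re.compile(r'"GET (\S+) HTTP')
sections = Counter()
for line in log_lines:
    if "Googlebot" not in line:  # crude filter; verify with reverse DNS in production
        continue
    m = request_re.search(line)
    if m:
        # Bucket requests by top-level path segment to spot crawl waste.
        path = m.group(1)
        section = "/" + path.lstrip("/").split("/")[0].split("?")[0]
        sections[section] += 1

print(sections.most_common())  # [('/search', 2), ('/products', 1)]
```

If /search outranks your product directories in a report like this, crawl budget is leaking.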

3. On-Page Optimization: Beyond Keywords and Meta Tags

On-page optimization isn’t just stuffing keywords into title tags. It’s about aligning every page element with search intent—what the user actually wants when they type a query.

Key on-page elements that matter:

  • Title tags and meta descriptions: Unique, descriptive, and within character limits. Avoid keyword stuffing; write for clicks.
  • Header structure (H1–H6): One H1 per page that matches the main topic. Subheadings should break content logically, not just include keywords.
  • Content quality and length: Thin content (under 300 words) rarely ranks for competitive terms. But length alone isn’t enough—the content must answer the query comprehensively.
  • Internal linking: Links to related pages help distribute authority and guide users. A good agency will map out a link graph, not just add random links.
  • Image optimization: Compressed images, descriptive alt text, and proper file names. Slow images hurt Core Web Vitals.
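Several of these checks are scriptable. A minimal sketch for title tags; the 60-character limit is a rough rule of thumb rather than an official Google cutoff, and the page data is invented:

```python
from collections import Counter

def audit_titles(pages, max_len=60):
    """pages: dict mapping URL -> <title> text. Returns (url, issue) pairs."""
    issues = []
    counts = Counter(pages.values())
    for url, title in pages.items():
        if not title:
            issues.append((url, "missing title"))
        elif len(title) > max_len:
            issues.append((url, f"title is {len(title)} chars; may be truncated in SERPs"))
        if counts[title] > 1:
            issues.append((url, "duplicate title shared with another page"))
    return issues

pages = {
    "/a": "Blue Widgets – Buy Online | ExampleCo",
    "/b": "Blue Widgets – Buy Online | ExampleCo",  # duplicate
    "/c": "",                                       # missing
}
for url, problem in audit_titles(pages):
    print(url, "->", problem)
```

Run the same pattern over meta descriptions and H1s, and you have a lightweight on-page audit your agency's report should at minimum match.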
Intent mapping example:

Query | Intent Type | Page Type Needed | Example
“best SEO tools 2025” | Commercial investigation | Comparison or listicle | “10 Best SEO Tools for 2025: Tested and Reviewed”
“how to fix 404 errors” | Informational | Step-by-step guide | “How to Fix 404 Errors: A Technical SEO Guide”
“buy SEO audit service” | Transactional | Service page with pricing | “Technical SEO Audit Service – Get a Full Site Analysis”

Risk alert: Some agencies use automated tools to “optimize” hundreds of pages at once. This often results in duplicate meta tags, unnatural keyword repetition, and poor user experience. Insist on manual or semi-automated optimization for high-value pages.

4. Core Web Vitals: The Performance Metric You Can’t Ignore

Core Web Vitals (LCP, CLS, INP) are user experience metrics that have been tied to ranking. They measure loading speed, visual stability, and interactivity.

What each metric means:

  • LCP (Largest Contentful Paint): How long the main content takes to load. Target: under 2.5 seconds.
  • CLS (Cumulative Layout Shift): How much the page layout shifts unexpectedly. Target: under 0.1.
  • INP (Interaction to Next Paint): How responsive the page is to user input. Target: under 200ms. (INP replaced First Input Delay as the responsiveness metric in March 2024.)
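The targets above are Google's "good" cut-offs, assessed at the 75th percentile of real-user visits; each metric also has a "poor" boundary beyond which a page fails outright. A small sketch encoding those published bands:

```python
# Good / poor boundaries per Google's published Core Web Vitals thresholds.
# LCP and INP in milliseconds, CLS unitless.
BANDS = {
    "LCP": (2500, 4000),
    "CLS": (0.1, 0.25),
    "INP": (200, 500),
}

def assess(metric, p75_value):
    """Classify a 75th-percentile field value for one Core Web Vital."""
    good, poor = BANDS[metric]
    if p75_value <= good:
        return "good"
    if p75_value <= poor:
        return "needs improvement"
    return "poor"

print(assess("LCP", 2100))  # good
print(assess("INP", 350))   # needs improvement
print(assess("CLS", 0.31))  # poor
```

Ask your agency to report these classifications from field (CrUX) data, not just raw lab numbers.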
How an agency should address Core Web Vitals:
  • Audit with real-user data: Use Chrome User Experience Report (CrUX) and Lighthouse. Lab data alone is misleading.
  • Prioritize fixes: Optimize images, defer non-critical JavaScript, implement lazy loading, and use a CDN.
  • Monitor over time: Core Web Vitals fluctuate with traffic, third-party scripts, and site updates. The agency should set up ongoing monitoring.
What can go wrong:
  • Over-optimization: Aggressively deferring all JavaScript can break functionality. Test before deploying.
  • Ignoring mobile: Core Web Vitals are measured separately for mobile and desktop. Mobile is often worse.
  • Using quick fixes: Adding loading="lazy" to every image doesn’t solve LCP; lazy-loading the hero image actually delays it. Above-the-fold images should load eagerly and with high priority.
Table: Common Core Web Vitals Issues and Fixes

Issue | Metric Affected | Typical Fix
Large hero image | LCP | Compress, resize, use next-gen formats (WebP, AVIF)
Third-party scripts (analytics, ads) | LCP, INP | Defer or async load, reduce number of scripts
Dynamic content without dimensions | CLS | Set explicit width/height on images and iframes
Slow server response time | LCP | Use CDN, optimize database queries, upgrade hosting
Unoptimized CSS | LCP | Remove unused CSS, inline critical CSS

Action step: Ask your agency to show you the CrUX report for your site before they start work. If they can’t, they’re not measuring real user experience.

5. Link Building: Quality Over Quantity, Always

Link building remains a strong ranking factor, but it’s also the most dangerous area for shortcuts. Black-hat tactics—like buying links from private blog networks (PBNs), using automated outreach, or participating in link exchanges—can lead to manual penalties.

What a responsible link building campaign looks like:

  • Backlink profile audit first: The agency should analyze your existing backlinks using tools like Ahrefs or Majestic. They’ll look for toxic links (from spammy sites, irrelevant directories, or PBNs) and disavow them if necessary.
  • Targeted outreach: They identify relevant, authoritative sites in your industry and pitch content that adds value—guest posts, resource pages, or broken link replacements.
  • Diverse link types: A healthy profile includes editorial links, directory listings (from reputable sources), and links from social media. Avoid over-reliance on one type.
  • Anchor text distribution: Exact-match anchor text (e.g., “best SEO agency”) should be rare. Branded, generic, and naked URLs should dominate.
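You can eyeball anchor distribution from a backlink export with a few lines of code. A rough sketch; the brand name and "money" phrases below are placeholders for your own:

```python
from collections import Counter

def classify_anchor(anchor, brand="exampleco", money_terms=("best seo agency",)):
    """Bucket an anchor text. brand and money_terms are hypothetical inputs."""
    a = anchor.lower().strip()
    if a.startswith(("http://", "https://", "www.")):
        return "naked URL"
    if brand in a:
        return "branded"
    if a in money_terms:
        return "exact match"
    return "generic"

anchors = ["ExampleCo", "click here", "read more",
           "https://example.co/", "best seo agency"]
dist = Counter(classify_anchor(a) for a in anchors)
print(dist)
# A healthy profile is dominated by branded, generic, and naked anchors;
# exact-match should be a small minority.
```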
Metrics that matter:
  • Domain Authority (DA) / Domain Rating (DR): A useful benchmark, but not a ranking factor itself. Focus on relevance, not just high DA.
  • Trust Flow (TF): Measures the quality of links pointing to the linking domain. A high TF with low Citation Flow (CF) is often seen as a sign of a cleaner profile.
  • Referring domains: More is better, but only if they’re diverse and relevant. 50 links from 50 different sites is stronger than 500 links from the same site.
Risks to watch for:
  • Buying links: Google’s spam policies explicitly prohibit paid links that pass PageRank. If an agency offers “guaranteed links from DA 50+ sites” for a flat fee, run.
  • Automated outreach: Mass emailing with generic templates often results in low-quality placements and can damage your brand reputation.
  • Ignoring disavow: If you have a history of bad links, the agency must disavow them. Not doing so leaves you vulnerable to penalties.
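The disavow file itself is plain text in a format Google documents: one domain: directive or full URL per line, with # comments. A sketch that generates one; the domains and URL below are invented:

```python
# Hypothetical toxic sources surfaced by a backlink audit.
toxic_domains = ["spammy-directory.example", "pbn-network.example"]
toxic_urls = ["http://blog.example/paid-links-page"]

lines = ["# Disavow file - generated from backlink audit"]
# domain: covers every URL on a spammy host, rather than listing links one by one.
lines += [f"domain:{d}" for d in sorted(toxic_domains)]
lines += toxic_urls

disavow = "\n".join(lines) + "\n"
print(disavow)
```

The resulting file is uploaded through Search Console's disavow tool; your agency should keep it versioned so you can see what was disavowed and when.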
Comparison: Link Building Approaches

Approach | Risk Level | Typical Cost | Time to Results
White-hat guest posting | Low | High (content creation + outreach) | 3–6 months
Broken link building | Low | Medium | 2–4 months
PBN links | Very high | Low to medium | Immediate, but short-lived
Directory submissions | Medium | Low | 1–3 months
Skyscraper technique | Low | High | 4–8 months

Action step: Ask for a link building strategy document that includes target sites, outreach templates, and a disavow process. If they can’t provide one, they’re flying blind.

6. Content Strategy: The Bridge Between Technical SEO and Users

Technical fixes get you indexed; content gets you ranked. A content strategy that ignores search intent is a waste of resources.

What a solid content strategy includes:

  • Keyword research with intent mapping: Not just “high volume keywords,” but understanding whether the user wants to learn, compare, or buy. Use tools like Semrush or Ahrefs to group keywords by intent.
  • Content gap analysis: Compare your site’s content against competitors. What topics are they covering that you’re not? What questions are they answering that you’re ignoring?
  • Editorial calendar: A schedule of content creation that aligns with business goals, seasonal trends, and keyword opportunities.
  • Content optimization: Existing content should be updated, not just rewritten. Add new sections, improve readability, and ensure internal links point to relevant pages.
Risk alert: Some agencies churn out low-quality blog posts at scale, targeting long-tail keywords with little competition. This can work temporarily, but Google’s Helpful Content Update aims to penalize content that doesn’t provide real value. Focus on depth and authority, not volume.

7. Analytics and Reporting: What to Track (and What to Ignore)

Reporting is where many agencies fall short. They show you vanity metrics—like “total visits” or “keywords in position 1–10”—while hiding the real story.

Metrics that matter:

  • Organic traffic to high-value pages: Not just total traffic, but traffic to pages that drive conversions (product pages, service pages, lead magnets).
  • Keyword rankings by intent: Track rankings for transactional keywords separately from informational ones.
  • Core Web Vitals scores: Monthly trends for LCP, CLS, and INP.
  • Conversion rate from organic traffic: Are visitors from search actually converting? If not, the content or landing page needs work.
  • Backlink growth: New referring domains, lost links, and anchor text distribution.
Metrics to be skeptical of:
  • “Total keywords in top 10”: Many of those could be branded terms or low-volume queries.
  • “Domain Authority increase”: DA is a third-party metric; Google doesn’t use it. Focus on organic traffic and conversions.
  • “Page speed score” from Lighthouse alone: Lab data is useful, but real-user data (CrUX) is what Google uses.
Table: Reporting Cadence by Metric

Metric | Frequency | Notes
Organic traffic | Weekly | Track trends, not daily fluctuations
Keyword rankings | Bi-weekly | Focus on priority keywords
Core Web Vitals | Monthly | Use CrUX data
Backlink profile | Monthly | New links, lost links, toxic links
Conversion rate | Monthly | Segment by traffic source

Action step: Request a sample report before hiring the agency. If it’s a PDF with pretty charts but no actionable insights, keep looking.

Summary: Your Agency Briefing Checklist

When you brief an SEO agency—or evaluate your current one—use this checklist:

  1. Technical Audit: Does the audit include crawl budget analysis, server logs, and Core Web Vitals?
  2. Crawl Budget Optimization: Are they identifying waste and blocking low-value pages?
  3. On-Page Optimization: Are they aligning content with search intent, not just keyword stuffing?
  4. Core Web Vitals: Are they using real-user data (CrUX) and prioritizing fixes by impact?
  5. Link Building: Is the approach white-hat, with a disavow process for toxic links?
  6. Content Strategy: Does the strategy include intent mapping, gap analysis, and an editorial calendar?
  7. Reporting: Are they tracking meaningful metrics, not vanity numbers?
A good agency will welcome these questions. A mediocre one will deflect. Remember: SEO is a long-term investment, not a quick fix. The agencies that promise instant results or guaranteed rankings are the ones to avoid. Focus on the process, and the results will follow.

If you’re looking to dive deeper, check out our guides on technical SEO audits and Core Web Vitals optimization.

Wendy Garza

Technical SEO Specialist

Wendy focuses on site architecture, crawl efficiency, and structured data. She breaks down complex technical issues into clear, actionable steps.
