The Expert’s Checklist for Selecting and Briefing an SEO Agency on Technical Audits, On-Page Optimization, and Site Performance
When you engage an SEO agency, the difference between a campaign that moves organic traffic and one that merely consumes budget often comes down to how well you brief the technical and performance components. Many businesses treat SEO as a black box—hand over a URL, wait for rankings—but the most effective partnerships are built on a clear, risk-aware definition of scope. This checklist, written from the perspective of an expert practitioner, will guide you through briefing an agency on technical SEO audits, on-page optimization, Core Web Vitals, and the foundational infrastructure that search engines use to discover and rank your content. We will avoid the language of guarantees; any agency that promises a “first page in 30 days” is selling something that cannot be delivered safely. Instead, we focus on measurable, sustainable improvements.
1. Define the Technical SEO Audit Scope Before the First Crawl
A technical SEO audit is not a single report; it is a diagnostic process that examines how search engine bots interact with your site’s architecture. Before the agency begins, you must agree on the depth of the audit. A superficial scan using a tool like Screaming Frog or Sitebulb can identify broken links and missing meta tags, but a thorough audit must also assess crawl budget allocation, server response codes, indexation status, and the interplay between JavaScript rendering and content visibility.
Checklist for the audit brief:
- Crawl budget analysis: Ask the agency to explain how Googlebot allocates crawl budget to your site. If you have thousands of low-value pages (e.g., parameterized URLs, thin affiliate content), the agency should identify which pages waste crawl capacity and propose a strategy to consolidate or exclude them via `robots.txt` or `noindex` directives. A common pitfall is blocking important resources in `robots.txt` while allowing infinite crawl of session IDs—this must be corrected.
- Indexation audit: Request a comparison of pages indexed in Google Search Console versus pages the agency discovers during the crawl. Discrepancies often indicate canonicalization issues, soft 404s, or blocked resources. The agency should produce a list of pages that are “crawled but not indexed” and explain why.
- Duplicate content identification: The audit must flag exact or near-duplicate content across your domain, especially for e-commerce sites with faceted navigation. The agency should recommend canonical tags or on-site parameter handling (Search Console's URL Parameters tool was retired in 2022, so parameters must now be managed with canonicals, internal linking, and `robots.txt`); a minimal markup sketch follows the findings table below. Avoid agencies that suggest mass deletion of duplicates without first analyzing user intent; some duplication is acceptable if it serves distinct search queries.
- Core Web Vitals baseline: The audit must include lab data (from Lighthouse or PageSpeed Insights) and field data (from the Chrome User Experience Report). The agency should identify the specific metrics—LCP, CLS, and INP (which replaced FID as a Core Web Vital in March 2024)—that fail to meet the “good” threshold and trace the cause to render-blocking resources, unoptimized images, or third-party scripts.
| Finding | Typical Cause | Impact on SEO | Recommended Action |
|---|---|---|---|
| Excessive crawl on low-value pages | Parameterized URLs, infinite scroll without `history.pushState` | Wasted crawl budget, delayed indexing of important pages | Implement `canonical` tags, use `robots.txt` disallow for parameter paths, or consolidate pagination |
| Missing or incorrect `canonical` tag | CMS misconfiguration, no self-referencing canonical | Search engines may index the wrong URL version, splitting link equity | Add self-referencing canonical to every page; audit for cross-domain canonicals if content is syndicated |
| LCP > 4.0 seconds (field data) | Unoptimized hero images, render-blocking CSS/JS | Poor user experience, weaker page experience ranking signal | Compress images, lazy-load below-the-fold content only (never the LCP image), inline critical CSS |
| CLS > 0.25 | Ads without reserved space, web fonts causing layout shift | High bounce rate, failing Core Web Vitals assessment | Set explicit width/height on images and ad slots, preload web fonts and use `font-display: swap` with metric-matched fallbacks |
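To make the canonical and indexation actions in the table concrete, here is a minimal markup sketch. The domain, path, and parameters are placeholders, and your CMS templates will differ; treat it as a reference point for the conversation rather than a drop-in fix.

```html
<!-- Hypothetical filtered URL: https://www.example.com/shoes?color=red&sort=price -->

<!-- Option A: the filtered page is a near-duplicate of the category page,
     so consolidate signals by pointing its canonical at the clean URL -->
<link rel="canonical" href="https://www.example.com/shoes" />

<!-- Option B: keep the filtered view out of the index entirely;
     "follow" still lets crawlers pass through its internal links -->
<meta name="robots" content="noindex, follow" />

<!-- The clean category URL itself carries a self-referencing canonical -->
<link rel="canonical" href="https://www.example.com/shoes" />
```

Treat Option A and Option B as alternatives rather than a combination: a canonical hint on a `noindex` page is generally read as a conflicting signal, so ask the agency to state which signal it relies on for each URL pattern.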
2. Brief On-Page Optimization Around Intent Mapping, Not Just Keywords
On-page optimization has evolved beyond stuffing keywords into title tags and meta descriptions. A modern brief must center on intent mapping—aligning your content with the search intent behind each target query. The agency should present a keyword research document that clusters terms by intent (informational, navigational, commercial, transactional) and then maps each cluster to a specific page or content type.
How to brief the agency:
- Provide a list of your top 20–50 revenue-generating pages and ask the agency to audit their current on-page signals: title tag, H1, meta description, image alt text, structured data markup (e.g., Product, FAQ, HowTo schema), and internal linking. The audit must flag missing or malformed schema, as this directly impacts eligibility for rich results (a minimal schema sketch follows this list).
- Require the agency to produce a content gap analysis: for each keyword cluster, identify pages that rank on page 2 or 3 and explain why they underperform. Common reasons include thin content (less than 300 words for informational queries), missing internal links from authoritative pages, or a mismatch between the page’s focus and the user’s intent.
- Avoid briefs that ask for “keyword density” targets. This is an outdated signal. Instead, ask the agency to demonstrate semantic relevance through the use of related terms, LSI keywords (though Google does not use LSI as a formal concept, the principle of topical coverage holds), and natural language that answers the user’s question.
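For the structured data portion of the brief, the sketch below shows the general shape of Product markup in JSON-LD. Every value is a placeholder; the agency should generate markup from your real page data and validate it with Google's Rich Results Test, since markup that contradicts the visible page does more harm than good.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Placeholder product name",
  "image": "https://www.example.com/images/placeholder.jpg",
  "description": "Placeholder description that mirrors the visible on-page copy.",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "49.00",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

The same pattern applies to FAQ and HowTo markup, though Google has tightened eligibility for several rich result types, so the brief should ask for expected eligibility rather than promised rich results.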

3. Core Web Vitals and Site Performance: Set Measurable Targets
Core Web Vitals are now a ranking signal, but more importantly, they are a user experience signal. When briefing an agency on site performance, you must distinguish between lab data (controlled environment) and field data (real-user metrics). An agency that only optimizes for Lighthouse scores without addressing real-user LCP or INP is doing half the job.
Briefing checklist for performance:
- Define thresholds: Use Google’s “good” thresholds (LCP ≤ 2.5 seconds, INP ≤ 200 ms, CLS ≤ 0.1; INP replaced FID, formerly ≤ 100 ms, as the responsiveness metric in March 2024). Ask the agency to set a target for each metric based on your current field data. If your site’s LCP is 4.5 seconds, a realistic first target is 3.0 seconds, not 2.5.
- Identify optimization levers: The agency should list specific changes—compressing images to WebP or AVIF, removing unused JavaScript, deferring non-critical CSS, implementing a CDN with edge caching, and optimizing server response time (TTFB); a markup sketch of these levers follows this list. Avoid agencies that propose a “magic plugin” that solves all Core Web Vitals; performance is a multi-layered problem.
- Monitor regressions: After the agency implements changes, require a 30-day monitoring period using Google Search Console’s Core Web Vitals report and a Real User Monitoring (RUM) tool like CrUX or a third-party service. If metrics improve but then regress due to a new plugin or third-party script, the agency must have a rollback plan.
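The sketch below illustrates several of these levers in plain HTML, assuming a typical hero image and a self-hosted web font. File paths and names are placeholders, and the right implementation depends on your stack and build tooling.

```html
<head>
  <!-- The hero image is usually the LCP element: fetch it early -->
  <link rel="preload" as="image" href="/images/hero.avif" />

  <!-- Inline only the CSS needed above the fold; load the rest without blocking rendering -->
  <style>/* critical CSS extracted at build time goes here */</style>
  <link rel="stylesheet" href="/css/main.css" media="print" onload="this.media='all'" />

  <!-- Avoid invisible text while the web font loads -->
  <style>
    @font-face {
      font-family: "BrandFont";
      src: url("/fonts/brandfont.woff2") format("woff2");
      font-display: swap;
    }
  </style>

  <!-- Defer non-critical JavaScript so it does not block first paint -->
  <script src="/js/app.js" defer></script>
</head>
<body>
  <!-- Explicit dimensions reserve layout space (prevents CLS); never lazy-load the LCP image -->
  <img src="/images/hero.avif" width="1200" height="600" alt="Hero" fetchpriority="high" />

  <!-- Below-the-fold images are safe to lazy-load -->
  <img src="/images/feature.webp" width="600" height="400" alt="Feature" loading="lazy" />
</body>
```

The `media="print"` swap is one common way to load non-critical CSS asynchronously; the agency may prefer a different pattern, which is fine as long as the render-blocking request disappears from the critical path.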
| Approach | Benefit | Trade-off | When to Use |
|---|---|---|---|
| Image compression (WebP, lazy loading) | Reduces LCP, saves bandwidth | Very old browsers may not support WebP; lazy-loading the hero/LCP image delays rendering | Always, but serve a JPEG/PNG fallback (e.g., via `<picture>`) and apply `loading="lazy"` only below the fold |
| Code splitting and tree shaking | Reduces JavaScript bundle size, improves INP | Requires build tool configuration; may break existing functionality if not tested | For sites with large React/Angular bundles |
| Server-side rendering (SSR) vs. static site generation (SSG) | SSR improves LCP for dynamic content; SSG offers fastest TTFB | SSR increases server load; SSG requires rebuild on content change | SSG for content-heavy sites; SSR for personalized or real-time data pages |
4. Crawl Budget, XML Sitemaps, and Robots.txt: The Infrastructure Brief
These three components form the technical foundation that tells search engines what to crawl and what to ignore. A poorly configured `robots.txt` can accidentally block your entire site from being crawled, while an outdated XML sitemap can waste crawl budget on deleted pages.
Briefing the agency:
- XML sitemap: Require the agency to generate a fresh sitemap that includes only canonical, indexable pages. Exclude paginated pages (unless they contain unique content), parameterized URLs, and pages with `noindex` directives. The sitemap must be submitted to Google Search Console and updated automatically whenever content is published or removed (sitemap and robots.txt sketches follow this list).
- Robots.txt: Ask the agency to review your current `robots.txt` for errors. Common mistakes include blocking CSS/JS files (which prevents Google from rendering the page), disallowing entire directories that contain important content, or using `Disallow: /` accidentally. The agency should validate the file with Search Console’s robots.txt report and a trusted robots.txt parser before deploying any change.
- Crawl budget optimization: If your site has over 10,000 URLs, the agency should analyze server log files to see which pages Googlebot actually crawls versus which pages you want crawled. This often reveals that Googlebot is wasting time on infinite calendar pages, faceted navigation filters, or duplicate product pages. The agency should then propose a strategy to reduce the crawl surface area.
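As reference points for this part of the brief, the sketches below show the general shape of both files. Domains, paths, and dates are placeholders; the agency should generate the real versions from your site structure and server configuration.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only canonical, indexable URLs belong in the sitemap -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/shoes</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

And a `robots.txt` sketch that keeps crawl traps out of the crawl path without blocking rendering resources:

```text
# Placeholder robots.txt
User-agent: *
# Keep session IDs, sort orders, and internal search out of the crawl
Disallow: /*?sessionid=
Disallow: /*?sort=
Disallow: /internal-search/
# Do NOT disallow CSS or JavaScript directories; Google needs them to render pages

Sitemap: https://www.example.com/sitemap.xml
```

Wildcard patterns such as `/*?sort=` are honored by Googlebot but are not part of the original robots.txt standard, so confirm behavior for any other crawlers you care about.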
5. Link Building and Backlink Profile: Brief for Quality, Not Volume
Link building remains a high-risk area. A manipulative or spammy backlink profile can trigger a manual action or algorithmic devaluation. When briefing an agency on link acquisition, you must emphasize quality signals over quantity.

Briefing checklist for link building:
- Define acceptable link sources: Ask the agency to provide a list of domains they plan to target. These should have editorial relevance to your industry, a healthy backlink profile themselves, and a track record of not selling links. Use metrics like Domain Authority (DA) or Trust Flow (TF) as rough guides, but do not treat them as absolute thresholds. A link from a low-DA but highly relevant industry blog often carries more value than a high-DA link from a generic directory.
- Reject black-hat tactics: Explicitly state in the brief that you will not accept links from private blog networks (PBNs), paid link schemes, automated directory submissions, or comment spam. The agency must provide a link acquisition strategy that relies on content creation, digital PR, or genuine outreach.
- Audit existing backlinks: Before starting new link building, the agency should perform a full backlink profile audit using tools like Ahrefs or Majestic. They should identify toxic links (e.g., from gambling, adult, or spam sites) and disavow them if necessary. However, disavow only if there is a manual action or a clear pattern of manipulative links; do not disavow links preemptively unless the profile is obviously harmful (a disavow file sketch follows this list).
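If the audit does conclude that a disavow is warranted, the file format itself is simple. The domains below are placeholders; the file is only ever uploaded through Google’s disavow tool, and only after the agency documents why each entry is there.

```text
# Disavow file sketch (placeholder domains)
# Lines beginning with "#" are comments and are ignored

# Disavow every link from an entire domain
domain:spam-directory-example.com
domain:link-network-example.net

# Disavow a single URL
https://blog-example.org/low-quality-guest-post/
```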
| Approach | Potential Reward | Risk Level | Due Diligence Required |
|---|---|---|---|
| Guest posting on relevant industry blogs | High—if the host site has editorial authority | Medium—some blogs accept low-quality posts | Verify host site’s traffic, spam score, and editorial standards |
| Digital PR (newsjacking, original research) | Very high—can generate natural editorial links | Low—if content is genuinely newsworthy | Requires investment in data or creative assets |
| Unlinked brand mentions | Moderate—easy to claim existing mentions | Low—purely non-manipulative | Use tools to find mentions without links; outreach for link insertion |
| Paid links (any form intended to pass ranking value) | Short-term gains at best | Very high—Google’s webspam team actively targets paid link schemes | Avoid entirely; no amount of due diligence makes paid links safe |
6. Analytics, Reporting, and the Ongoing Brief
Finally, the briefing must include a reporting cadence that ties SEO activities to business outcomes. Avoid vanity metrics like “total keywords in top 10” without context. Instead, ask the agency to report on:
- Organic traffic to revenue-generating pages (not just homepage).
- Core Web Vitals field data trends month over month.
- Crawl budget efficiency (number of important pages crawled vs. total crawled).
- Backlink profile health (new links gained, lost, and any toxic links disavowed).
Summary: The Expert’s Closing Advice
A well-briefed SEO agency can transform your site’s technical health, on-page relevance, and performance metrics. But the brief is a two-way conversation: you must provide clear expectations, and the agency must provide transparent methodologies. Avoid any partner that promises “instant results” or “guaranteed rankings.” Instead, look for an agency that can articulate how they will improve crawl efficiency, resolve Core Web Vitals issues, and build a link profile that withstands algorithm updates. Use this checklist as your starting point, and remember that SEO is a long-term investment in your site’s infrastructure, not a quick fix.
For further reading on technical SEO fundamentals, see our guide on conducting a site health audit and the on-page optimization checklist.
