The SEO Agency Services Checklist: What to Expect from Technical Audits, On-Page Optimization & Site Performance
You’ve decided to hire an SEO agency—or maybe you’re evaluating your current one. The pitch deck promised “comprehensive technical audits,” “on-page optimization,” and “site performance improvements.” But when you dig into the deliverables, the gap between the sales language and the actual work can feel like a chasm. This checklist is designed to bridge that gap. It’s a practical, risk-aware guide to the core services a reputable SEO agency should provide, broken down by technical audits, on-page work, and performance tuning. Use it to brief your agency, audit their output, or set your own internal standards. No guarantees, no magic bullets—just the systematic work that actually moves the needle.
What a Technical SEO Audit Actually Covers
A technical SEO audit is not a one-time health check; it’s a diagnostic process that identifies barriers preventing search engines from crawling, indexing, and rendering your site effectively. A proper audit should cover at least five foundational areas: crawlability, indexation, site architecture, duplicate content, and Core Web Vitals. If your agency hands you a PDF with only a few issues flagged, they’re probably skimming the surface.
Start with crawl budget—the allocation of resources search engines dedicate to your site. For large sites (over 10,000 pages), inefficient crawl paths can waste this budget on low-value URLs like session-ID variants or infinite scroll archives. The agency should analyze server logs (not just crawl tools) to identify which URLs Googlebot actually visits, how often, and where it gets stuck. A robots.txt misconfiguration—like blocking important CSS or JS files—can cripple rendering. The fix is often a combination of cleaning up the robots.txt directives, consolidating thin content, and improving internal linking to guide crawlers toward priority pages.
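As a minimal sketch of that log analysis, the script below counts Googlebot hits per URL path from combined-format access log lines. The sample lines are illustrative; in production you would also verify requests against Google's published crawler IP ranges, since the user-agent string can be spoofed.

```python
import re
from collections import Counter

# Extract the request path and user-agent from a combined-log-format line.
LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" \d+ \d+ "[^"]*" "([^"]*)"')

def googlebot_hits(log_lines):
    """Count hits per URL path for lines whose user-agent claims Googlebot."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and "Googlebot" in m.group(2):
            hits[m.group(1)] += 1
    return hits

# Illustrative sample lines (not real traffic data).
sample = [
    '66.249.66.1 - - [10/May/2025:10:00:00 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/May/2025:10:00:01 +0000] "GET /?sessionid=abc123 HTTP/1.1" 200 900 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '198.51.100.7 - - [10/May/2025:10:00:02 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample).most_common())
```

Sorting the counter surfaces exactly the pattern the audit looks for: crawl budget spent on session-ID URLs instead of priority pages.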
Next, the audit must address XML sitemaps and canonical tags. A sitemap that includes 50,000 URLs but hasn’t been updated in six months is worse than no sitemap—it misleads crawlers about your site’s current structure. The agency should verify that each sitemap is dynamically generated, excludes noindexed pages, and is submitted via Google Search Console. For canonical tags, the most common failure is using them inconsistently: a product page might have a self-referencing canonical on one version but point to a different URL on a mobile variant. The audit should flag every instance of conflicting or missing canonicals, especially for paginated content, parameter-heavy URLs, and syndicated articles.
Duplicate content issues are often subtler than you think. It’s not just about identical text across domains; it’s about near-duplicate product descriptions, filtered category pages, and printer-friendly versions. A thorough audit will use a tool like Screaming Frog or Sitebulb to cluster similar pages, then recommend either consolidation (301 redirects), noindexing, or canonicalization. The agency should also check for duplicate meta titles and descriptions—a low-effort signal that often indicates neglected CMS hygiene.
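The duplicate-meta-title check is simple enough to script directly against a crawler export. In this hedged sketch, the `pages` dict stands in for a Screaming Frog or Sitebulb export of URL-to-title pairs; titles are normalized before grouping so trivial case differences don't hide duplicates.

```python
from collections import defaultdict

def duplicate_titles(pages):
    """Group URLs by normalized meta title; return only groups with duplicates."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Illustrative crawl export (URL -> meta title).
pages = {
    "/faucets/delta-100": "Delta Faucet | Example Store",
    "/faucets/delta-200": "Delta Faucet | Example Store",
    "/faucets/moen-300": "Moen Faucet | Example Store",
}
print(duplicate_titles(pages))
```

Each duplicate group is then a candidate for consolidation, noindexing, or canonicalization, per the options above.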
Finally, the audit must include a Core Web Vitals assessment. This isn’t just a Lighthouse score; it’s a field-data analysis from the Chrome User Experience Report (CrUX). The agency should identify the specific metrics dragging you down—Largest Contentful Paint (LCP) over 4 seconds, Interaction to Next Paint (INP) above 500ms (INP replaced First Input Delay as a Core Web Vital in March 2024), or Cumulative Layout Shift (CLS) above 0.25—and trace them back to root causes like uncompressed images, render-blocking JavaScript, or third-party scripts. A good audit will prioritize fixes by impact, not by ease of implementation.
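The "good / needs improvement / poor" bucketing Google applies to field data can be expressed in a few lines. The thresholds below are Google's published Core Web Vitals boundaries; how an agency maps CrUX percentile values onto them is what the audit should document.

```python
# (good_max, poor_min) per metric, per Google's published thresholds.
THRESHOLDS = {
    "LCP": (2500, 4000),  # milliseconds
    "INP": (200, 500),    # milliseconds (INP replaced FID in 2024)
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def classify(metric, value):
    """Bucket a field-data value as good / needs improvement / poor."""
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    return "poor" if value > poor_min else "needs improvement"

print(classify("LCP", 4200), classify("INP", 180), classify("CLS", 0.18))
```

Running the 75th-percentile CrUX values through a classifier like this tells you which metric to attack first.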
Table: Common Technical Audit Findings and Their Impact
| Issue | Typical Cause | Potential Impact | Priority Level |
|---|---|---|---|
| Crawl budget wasted on low-value pages | Infinite scroll, session IDs, noindexed pages in sitemap | Indexation delays for important content | High |
| Robots.txt blocking critical resources | Misconfigured `Disallow` directives for CSS/JS | Incomplete rendering, poor Core Web Vitals | Critical |
| Missing or conflicting canonical tags | CMS automation, mobile/desktop split | Diluted link equity, indexation confusion | High |
| Duplicate meta titles/descriptions | Template-based CMS, lack of editorial oversight | Reduced click-through rates, poor user experience | Medium |
| LCP > 4 seconds | Unoptimized hero images, render-blocking scripts | Poor ranking on mobile, high bounce rate | High |
On-Page Optimization: Beyond Keyword Stuffing
On-page optimization has evolved from stuffing keywords into title tags to a discipline centered on search intent, content relevance, and user experience. A competent agency will not just tweak meta fields; they’ll conduct keyword research and intent mapping to align your content with what users actually search for at different stages of the funnel.

The process begins with keyword discovery. The agency should use tools like Ahrefs, Semrush, or Google Keyword Planner to identify terms with realistic search volume and competition. But the real value comes from intent mapping: categorizing keywords into informational (e.g., “how to fix a leaky faucet”), navigational (“Home Depot plumbing near me”), commercial (“best faucet brands 2025”), and transactional (“buy Delta faucet online”). For each category, the agency must recommend a content format that matches the intent—a how-to guide for informational queries, a product comparison table for commercial ones, a location page for navigational ones. If they suggest the same blog post structure for all keywords, they’re ignoring the core principle of search intent.
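A first-pass version of that intent bucketing can be rule-based. The trigger words below are illustrative assumptions, not a definitive taxonomy—real intent mapping layers in SERP features and tool data—but the sketch shows the shape of the classification.

```python
# Checked in order: transactional signals win over commercial, etc.
INTENT_RULES = [
    ("transactional", ("buy", "order", "coupon", "price")),
    ("commercial", ("best", "vs", "review", "top", "compare")),
    ("informational", ("how to", "what is", "guide", "why")),
]

def classify_intent(keyword):
    """Assign a funnel-stage intent bucket from simple trigger words."""
    kw = keyword.lower()
    for intent, triggers in INTENT_RULES:
        if any(t in kw for t in triggers):
            return intent
    return "navigational"  # fallback: brand/location-style queries

keywords = ["how to fix a leaky faucet", "best faucet brands 2025",
            "buy delta faucet online", "home depot plumbing near me"]
print({kw: classify_intent(kw) for kw in keywords})
```

The point of the exercise is downstream: each bucket maps to a different content format, so the same keyword list yields four different briefs.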
Content strategy is the next layer. A good agency will audit your existing content against the keyword map, identifying gaps (topics you should cover but don’t), overlaps (multiple pages competing for the same query), and opportunities (high-volume terms where your content is thin). They’ll then create an editorial calendar that prioritizes pages based on business value and ranking potential. But here’s the risk: some agencies focus only on volume, pushing you to write 50 blog posts a month without considering quality or internal linking. A better approach is to create fewer, deeper pages that serve as topical hubs—a pillar page on “SEO for e-commerce” with cluster posts on “product page optimization,” “category page structure,” and “technical audits for Shopify.” This hub-and-spoke model signals topical authority to search engines.
On-page optimization also includes technical elements like heading hierarchy, image alt text, and internal link structure. The agency should ensure that each page has a single H1 that matches the primary keyword, H2s that break down subtopics, and alt text that describes images accurately (not keyword-stuffed). Internal links should be contextual, not generic “click here” anchors. A common mistake is linking to the homepage from every page—that’s wasted link equity. Instead, the agency should create a silo structure where category pages link to subcategories, which link to product pages, creating a logical flow of authority.
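Two of the heading checks above—exactly one H1, no skipped levels—are mechanical enough to automate. This sketch uses the standard-library HTML parser; a production audit would run it across the full crawl, not one page.

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Record heading levels (h1..h6) in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []
    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def heading_issues(html):
    """Flag multiple/missing H1s and skipped heading levels."""
    p = HeadingCollector()
    p.feed(html)
    issues = []
    if p.levels.count(1) != 1:
        issues.append("page should have exactly one H1")
    for prev, cur in zip(p.levels, p.levels[1:]):
        if cur > prev + 1:
            issues.append(f"heading level skip: h{prev} -> h{cur}")
    return issues

html = "<h1>SEO Guide</h1><h3>Crawling</h3><h1>Another H1</h1>"
print(heading_issues(html))
```

What the script cannot judge—whether the H1 actually matches the page's primary keyword—stays a human editorial task.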
Table: On-Page Optimization Components
| Component | Best Practice | Common Pitfall | Risk |
|---|---|---|---|
| Title tag | 50-60 characters, primary keyword near front, unique per page | Same title for multiple pages, keyword stuffing | Duplicate content signal, poor CTR |
| Meta description | 150-160 characters, compelling call-to-action, includes secondary keyword | Auto-generated from CMS, missing description | Lower CTR, missed opportunity |
| Heading structure | H1 for primary keyword, H2/H3 for subtopics, logical hierarchy | Multiple H1s, skipping heading levels | Confuses search engines, poor readability |
| Image alt text | Descriptive, includes keyword naturally, under 125 characters | Keyword stuffing, empty alt tags | Accessibility issues, missed image search traffic |
| Internal links | Contextual, uses descriptive anchor text, links to relevant pages | Generic anchors (“here”), excessive links to homepage | Diluted link equity, poor user navigation |
Site Performance: Core Web Vitals and Beyond
Site performance is no longer a nice-to-have; it’s a ranking factor for both desktop and mobile. But many agencies treat Core Web Vitals as a checklist item rather than an ongoing optimization process. The difference between a good agency and a mediocre one is how they approach performance—do they run a Lighthouse report once and call it done, or do they implement continuous monitoring and iterative fixes?
Start with LCP optimization. The largest content element on your page—usually a hero image or a large text block—must load within 2.5 seconds. The agency should identify the specific element and the resource that delays it. Common fixes include preloading the hero image, using next-gen formats like WebP or AVIF, implementing lazy loading for below-the-fold images, and deferring non-critical CSS and JavaScript. If the LCP element is a text block, the fix might involve optimizing font delivery (using `font-display: swap` and preloading key fonts). A good agency will also check server response times—if your Time to First Byte (TTFB) is over 800ms, the problem might be hosting, caching, or a slow database query.
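One of the LCP culprits named above—render-blocking JavaScript—can be surfaced with a simple scan for `<script src>` tags that carry neither `defer` nor `async`. A sketch, assuming page HTML is already fetched:

```python
from html.parser import HTMLParser

class BlockingScriptFinder(HTMLParser):
    """Collect external scripts that block HTML parsing (no defer/async)."""
    def __init__(self):
        super().__init__()
        self.blocking = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            a = dict(attrs)
            if "src" in a and "defer" not in a and "async" not in a:
                self.blocking.append(a["src"])

def render_blocking_scripts(html):
    p = BlockingScriptFinder()
    p.feed(html)
    return p.blocking

html = ('<head><script src="/js/app.js"></script>'
        '<script src="/js/analytics.js" defer></script></head>')
print(render_blocking_scripts(html))
```

Each flagged script is then a candidate for `defer`, `async`, or removal—subject to testing that nothing above the fold depends on it executing early.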
CLS is about visual stability. A page that shifts layout after the user starts interacting with it is frustrating and penalized by Google. The agency should audit for common CLS triggers: images without explicit width/height attributes, dynamically injected ads, embedded videos that load late, and custom fonts that cause FOIT (Flash of Invisible Text) or FOUT (Flash of Unstyled Text). Fixes include setting aspect ratios for all media, reserving space for ads and embeds, and using `size-adjust` for font swaps. If the page uses a third-party widget (like a chatbot or a social media feed), the agency should test whether it can be loaded asynchronously or deferred until after the main content.
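The first CLS trigger listed—images without explicit dimensions—is also the easiest to detect automatically. This sketch flags `<img>` tags missing `width` or `height`, since without them the browser cannot reserve space and the late-arriving image shifts the layout.

```python
from html.parser import HTMLParser

class ImgDimensionChecker(HTMLParser):
    """Collect <img> tags missing explicit width/height attributes."""
    def __init__(self):
        super().__init__()
        self.missing = []
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if "width" not in a or "height" not in a:
                self.missing.append(a.get("src", "(no src)"))

def imgs_without_dimensions(html):
    p = ImgDimensionChecker()
    p.feed(html)
    return p.missing

html = ('<img src="/hero.webp" width="1200" height="600">'
        '<img src="/banner.png">')
print(imgs_without_dimensions(html))
```

Pages styled with CSS `aspect-ratio` can legitimately omit the attributes, so flagged images are review candidates, not automatic failures.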
FID (now replaced as a Core Web Vital by Interaction to Next Paint, or INP, as of March 2024) measures responsiveness. A high FID or INP means users experience delays when clicking buttons or tapping links. The root cause is almost always JavaScript: heavy frameworks, unoptimized event handlers, or third-party scripts that block the main thread. The agency should define a performance budget—a set of limits on script size, number of requests, and time to interactive. They should also implement code splitting (loading only the JavaScript needed for the current page), defer all non-critical scripts, and use a service worker for caching. If the site uses a single-page application framework like React or Vue, the agency must ensure that server-side rendering or static generation is in place to avoid an empty white screen.
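Enforcing a performance budget reduces to comparing measured stats against agreed limits. The budget numbers in this sketch are illustrative, not Google-mandated values; the useful part is that the check runs in CI and fails loudly when a release blows past a limit.

```python
# Illustrative budget: script weight (KB), request count, time-to-interactive (ms).
BUDGET = {"script_kb": 300, "requests": 50, "tti_ms": 3800}

def over_budget(measured, budget=BUDGET):
    """Return {metric: (measured, limit)} for every exceeded budget line."""
    return {k: (measured[k], limit) for k, limit in budget.items()
            if measured[k] > limit}

# Illustrative measurements from a Lighthouse/WebPageTest run.
measured = {"script_kb": 540, "requests": 41, "tti_ms": 5200}
print(over_budget(measured))
```

An empty result means the build passes; a non-empty dict names exactly which budget lines were exceeded and by how much.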

Link Building: The Risk-Aware Approach
Link building is the most controversial aspect of SEO services. A reputable agency will never promise “guaranteed first page ranking” or “instant results” through link building—those are red flags for black-hat tactics like private blog networks (PBNs), paid links, or automated outreach. Instead, they should offer a transparent, risk-aware strategy that focuses on earning links through quality content and genuine relationships.
The first step is a backlink profile audit. The agency should analyze your existing links using tools like Majestic or Ahrefs, focusing on authority metrics such as Majestic’s Trust Flow or Ahrefs’ Domain Rating (third-party proxies, not Google signals). A healthy profile has a mix of high-authority editorial links, niche-relevant directories, and contextual links from industry publications. A toxic profile includes links from spammy directories, link farms, or sites with low trust scores. The agency should identify which links to disavow (using Google’s Disavow Tool) and which to preserve. They should also check for unnatural link patterns—like a sudden spike in links from unrelated sites—that could trigger a manual penalty.
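The disavow step ends in a plain-text file of `domain:` lines, which Google's Disavow Tool accepts. In this sketch, the `trust` field and the cutoff are illustrative assumptions—map them to whichever metric your tool actually exports, and review every candidate by hand before disavowing, since the tool's effects are hard to reverse.

```python
def disavow_lines(backlinks, min_trust=15):
    """Emit disavow-file lines for domains below a trust-score cutoff."""
    toxic = sorted({b["domain"] for b in backlinks if b["trust"] < min_trust})
    return [f"domain:{d}" for d in toxic]

# Illustrative backlink export (domain + tool-reported trust score).
backlinks = [
    {"domain": "industry-news.example", "trust": 54},
    {"domain": "spam-directory.example", "trust": 3},
    {"domain": "link-farm.example", "trust": 7},
]
print("\n".join(disavow_lines(backlinks)))
```

The script is a triage aid, not a decision-maker: the agency's judgment call on each domain is the actual deliverable.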
For new link acquisition, the agency should propose a content-driven approach. This could include creating data-driven research (e.g., an industry report with original statistics), guest posting on reputable sites in your niche, or building resource pages that naturally attract links. The key is relevance: a link from a high-authority site in your industry is worth more than ten links from generic directories. The agency should also monitor your competitors’ link profiles to identify gaps and opportunities. For example, if a competitor has backlinks from a listicle of “top 10 tools for marketers,” you could pitch your tool for inclusion in an updated version of that article.
Table: Link Building Approaches
| Approach | Description | Risk Level | Typical Results Timeline |
|---|---|---|---|
| Guest posting | Writing articles for other sites in your niche | Low (if relevant and high-quality) | 2-4 months |
| Broken link building | Finding broken links on other sites and offering your content as replacement | Low | 1-3 months |
| Digital PR | Creating newsworthy content (studies, infographics) that journalists link to | Medium (requires effort) | 3-6 months |
| PBN links | Using a network of private sites to create artificial links | High (penalty risk) | Immediate but temporary |
| Paid links | Buying links from high-authority sites | High (violates Google guidelines) | Immediate but risky |
How to Brief an SEO Agency: A Practical Checklist
When you brief an SEO agency, avoid vague requests like “improve our rankings.” Instead, provide specific, measurable goals and the data they need to start. Here’s a step-by-step checklist:
- Define your primary KPIs. Are you after organic traffic growth, conversion rate improvements, or brand visibility? Choose one or two metrics that align with business goals.
- Share access to analytics. Grant view-only access to Google Analytics, Google Search Console, and any server logs. The agency needs historical data to identify trends and anomalies.
- Provide a list of target keywords. Include both high-volume and long-tail terms, along with your current ranking positions (if known). The agency will cross-reference this with their own tools.
- Disclose past SEO work. If you’ve had previous agencies or in-house efforts, share what was done—especially any link building campaigns, redirect chains, or CMS migrations. This helps avoid repeating mistakes.
- Set a realistic timeline. SEO is a long-term investment. Expect noticeable improvements in 4-6 months for on-page changes, and 6-12 months for link building and authority growth.
- Agree on reporting frequency. Monthly reports that cover rankings, traffic, and conversion data are standard. But also ask for quarterly deep dives on technical health and backlink profile changes.
- Establish a risk tolerance. Discuss what tactics are off-limits (e.g., PBNs, automated content, keyword stuffing) and what the penalty mitigation plan would be if something goes wrong.
Summary: What to Expect from a Top SEO Agency
A top SEO agency doesn’t promise instant results or guaranteed rankings. Instead, they deliver a systematic process: a thorough technical audit that identifies crawl and indexation issues, an on-page strategy that aligns content with search intent, a performance optimization plan that improves Core Web Vitals, and a risk-aware link building approach that earns authority over time. They use data to prioritize fixes, not to overwhelm you with a hundred low-priority tasks. And they communicate transparently—showing you what’s working, what’s not, and why.
The checklist above is your starting point. Use it to vet agency proposals, evaluate ongoing work, or refine your own in-house SEO processes. Remember: the best SEO agency is the one that treats your site as a long-term asset, not a quick-win project.
