You’ve hired an SEO agency—or you’re about to. The pitch sounded polished: “We’ll fix your technical foundation, optimize every page, and make your site fly.” But months in, your organic traffic is flat, your Core Web Vitals still blink red in Search Console, and the monthly reports are heavy on vanity metrics like “impressions” but light on actionable fixes. Sound familiar?
Here’s the uncomfortable truth: some SEO agencies talk a good game on technical audits and on-page optimization, but many skip the grunt work that actually moves the needle. They’ll run a generic crawl, slap together a keyword list, and call it a strategy. Meanwhile, your crawl budget is leaking on thin pages, your canonical tags are pointing at the wrong URLs, and your content strategy is built on search volume alone—ignoring what users actually need.
This checklist is your defense. Use it to brief your agency, evaluate their deliverables, or run your own internal audit. We’ll cover the non-negotiable steps for a technical SEO audit, on-page optimization that aligns with search intent, and site performance improvements that Google actually rewards. No fluff, no guarantees—just the practices that separate effective SEO from expensive theater.
The Technical SEO Audit: What Your Agency Should Actually Check
A proper technical audit isn’t just about finding 404s and missing meta descriptions. It’s about understanding how search engines discover, crawl, and render your pages—and fixing the bottlenecks that waste your crawl budget. Many agencies run a tool like Screaming Frog or Sitebulb, export a CSV, and call it done. But the real value comes from interpretation: why is your crawl rate low? Why are certain pages stuck in “crawled – not indexed”?
Here’s what a thorough technical audit should include:
- Crawl budget analysis: Review server logs or use a log analyzer to see which pages Googlebot actually hits. Compare that to your XML sitemap. If high-priority pages aren’t being crawled, your agency should investigate why—slow server response, excessive redirect chains, or a bloated sitemap with low-value URLs.
- robots.txt and XML sitemap hygiene: Ensure robots.txt isn’t accidentally blocking critical resources (CSS, JS, images) and that your sitemap only includes indexable, canonical pages. Remove pages with noindex tags, redirects, or 404s from the sitemap.
- Canonical tag audit: Check that every page has a self-referencing canonical or an explicitly set canonical to the preferred version. Duplicate content issues often stem from missing or conflicting canonicals—especially on e-commerce sites with faceted navigation.
- Duplicate content detection: Identify near-identical pages (e.g., product pages with only color variations) and consolidate them using canonical tags or 301 redirects. Thin duplicate content wastes crawl budget and dilutes ranking signals.
- Core Web Vitals assessment: Measure real-user LCP, CLS, and INP (which replaced FID as a Core Web Vital in March 2024) using the Chrome User Experience Report (CrUX) and the Core Web Vitals report in Search Console. Lab data from Lighthouse is useful for debugging, but field data tells you what real users experience. A good agency will prioritize fixes that impact the most visited pages.
- Indexation status review: Use the URL Inspection tool in Search Console to spot-check a sample of pages. Are they indexed? If not, why? Common reasons: noindex tag, blocked by robots.txt, server errors, or low-quality content that Google decided not to index.
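The crawl budget check above is easy to automate. The sketch below compares Googlebot hits in a combined-format access log against your sitemap URLs and reports sitemap pages Googlebot never requested. The log format, regex, and example URLs are assumptions; adapt them to your server’s actual log layout.

```python
import re
from urllib.parse import urlparse

# Matches a combined-log request line whose user agent contains "Googlebot".
GOOGLEBOT_LINE = re.compile(r'"GET (\S+) HTTP/[\d.]+" \d{3} .*Googlebot')

def crawled_paths(log_lines):
    """Collect the URL paths Googlebot requested, from raw access-log lines."""
    hits = set()
    for line in log_lines:
        m = GOOGLEBOT_LINE.search(line)
        if m:
            hits.add(m.group(1))
    return hits

def uncrawled_sitemap_urls(sitemap_urls, log_lines):
    """Return sitemap URLs whose paths never appear in Googlebot's crawl log."""
    hits = crawled_paths(log_lines)
    return [u for u in sitemap_urls if urlparse(u).path not in hits]
```

If high-priority URLs show up in the uncrawled list month after month, that is the signal to dig into server response times, redirect chains, or sitemap bloat.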

| Finding | Potential Impact | Recommended Fix |
|---|---|---|
| High number of 4xx/5xx errors | Wasted crawl budget, poor user experience | Fix broken links, implement proper redirects |
| Missing or incorrect canonical tags | Duplicate content, diluted ranking signals | Set self-referencing canonicals or point to preferred URL |
| Large, uncompressed images | High LCP, poor Core Web Vitals | Compress and serve next-gen formats (WebP) |
| Sitemap bloated with low-value or non-indexable URLs | Slow crawl discovery, wasted crawl budget | Trim to include only indexable, high-value pages |
| Excessive redirect chains | Slowed crawl efficiency, lost link equity | Flatten chains to direct 301s |
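Flattening redirect chains is mechanical once you have a source-to-destination map (for example, exported from a crawler). A minimal sketch, assuming that map is a plain Python dict; the hop count tells you which sources need their 301 rewritten to point directly at the final URL:

```python
def flatten_redirects(redirect_map, max_hops=10):
    """Follow each redirect chain to its end.

    Returns {source: (final_destination, hops)}; any entry with hops > 1
    should be rewritten as a single 301 to the final destination. A
    destination of None indicates a redirect loop.
    """
    flattened = {}
    for src in redirect_map:
        seen, cur, hops = {src}, src, 0
        while cur in redirect_map and hops < max_hops:
            cur = redirect_map[cur]
            if cur in seen:  # loop detected: chain never resolves
                cur = None
                break
            seen.add(cur)
            hops += 1
        flattened[src] = (cur, hops)
    return flattened
```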
On-Page Optimization: Beyond Keyword Stuffing
On-page optimization has evolved beyond stuffing a target keyword into the title tag and H1. Modern on-page SEO is about aligning content with search intent—what the user actually wants when they type a query. Your agency should be mapping keywords to intent categories (informational, navigational, commercial, transactional) and crafting content that satisfies that need.
Here’s how to evaluate your agency’s on-page approach:
- Keyword research with intent mapping: They shouldn’t just hand you a spreadsheet of high-volume terms. They should categorize keywords by intent and recommend content types (blog posts, product pages, guides) that match. For example, “best SEO tools” is commercial intent—it needs a comparison page, not a blog post.
- Content strategy that fills gaps: A content strategy should identify topics your site doesn’t cover but competitors do. Use tools like Ahrefs or Semrush to find “content gaps”—keywords where rivals rank but you don’t. Then prioritize by search volume and relevance to your business.
- Title tags and meta descriptions: Each page should have a unique title tag of roughly 60 characters or fewer and a meta description under roughly 160 characters; Google truncates by pixel width, so treat these as guidelines rather than hard limits. Include the primary keyword naturally, but write for clicks, not just search engines. A good meta description answers “why click this result?”
- Header structure and readability: Use a single H1 that matches the page topic, then H2s for major sections, H3s for subsections. This helps search engines understand page hierarchy and improves readability. Avoid keyword stuffing in headers.
- Internal linking with anchor text: Link to relevant internal pages using descriptive anchor text. This distributes link equity and helps users (and Google) discover related content. A common mistake: linking with “click here” or generic phrases.
- Image optimization: Every image should have a descriptive alt text (not keyword-stuffed) and a compressed file size. Use lazy loading for below-the-fold images to improve LCP.
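The title and meta description checks above are simple to script across a crawl export. A minimal sketch, with the caveat that the 60/160-character limits are rules of thumb, not exact truncation points:

```python
TITLE_MAX, META_MAX = 60, 160  # rough SERP-truncation guidelines, not hard limits

def audit_snippet(title, meta_description):
    """Flag titles and meta descriptions likely to be truncated or missing.

    Returns a list of human-readable issues; an empty list means the
    snippet passes these basic length checks.
    """
    issues = []
    if not title:
        issues.append("missing title")
    elif len(title) > TITLE_MAX:
        issues.append(f"title is {len(title)} chars (> {TITLE_MAX})")
    if not meta_description:
        issues.append("missing meta description")
    elif len(meta_description) > META_MAX:
        issues.append(f"meta description is {len(meta_description)} chars (> {META_MAX})")
    return issues
```

Run this over every URL in a crawl export and you have an instant duplicate/missing-snippet report to compare against what your agency delivers.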
| Search Query | Intent | Recommended Content Type | Example |
|---|---|---|---|
| “how to improve site speed” | Informational | Blog post or guide | Step-by-step tutorial |
| “best SEO agency for e-commerce” | Commercial | Comparison or listicle | “Top 5 E-commerce SEO Agencies” |
| “SEO audit tool pricing” | Transactional | Product or pricing page | Direct tool comparison |
| “Google Search Console setup” | Navigational | Setup guide or resource | “How to Add Your Site to GSC” |
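Intent mapping at scale usually starts with modifier heuristics like the ones implied by the table above. A rough first-pass classifier; the modifier lists are illustrative assumptions you would tune to your own keyword set, and a human (or the live SERP) should confirm the bucket before committing to a content type:

```python
# Hypothetical modifier lists: tune these to your market and language.
TRANSACTIONAL = ("buy", "pricing", "price", "discount", "coupon")
COMMERCIAL = ("best", "top", "review", "vs", "comparison")
INFORMATIONAL = ("how to", "what is", "why", "guide", "tutorial")

def classify_intent(query):
    """Assign a first-pass intent bucket based on query modifiers.

    Queries with no recognizable modifier default to navigational,
    since brand/destination queries rarely carry one.
    """
    q = query.lower()
    if any(m in q for m in TRANSACTIONAL):
        return "transactional"
    if any(m in q for m in COMMERCIAL):
        return "commercial"
    if any(m in q for m in INFORMATIONAL):
        return "informational"
    return "navigational"
```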
Site Performance: Why Core Web Vitals Matter (and What Can Go Wrong)
Core Web Vitals aren’t just a ranking signal; they measure real user experience. Google’s page experience update folded LCP, CLS, and INP (which replaced FID in March 2024) into the ranking algorithm. But here’s where agencies often go wrong: they optimize for lab data (Lighthouse) instead of field data (CrUX). A page might score high in Lighthouse but still have poor LCP for real users on slow connections.
What can go wrong with site performance work:
- Wrong redirects: Implementing 302 redirects instead of 301s for permanent moves can split link equity and confuse search engines. Always use 301 for permanent URL changes.
- Black-hat link building disguised as performance: Some agencies bundle “link building” with site speed work, promising fast results. Be wary of any agency that offers guaranteed ranking improvements or uses automated link schemes. A spike in low-quality backlinks can trigger a manual penalty.
- Over-optimization of Core Web Vitals: Aggressively lazy-loading above-the-fold images or removing JavaScript entirely can break functionality. The goal is balance—fast loading without sacrificing user experience.
- Neglecting mobile performance: Desktop Lighthouse scores are easy to fix. Mobile performance—especially on 3G connections—requires more work. Ensure your agency tests on real mobile devices, not just emulated ones.
A sound performance workflow looks like this:
- Measure a baseline: Use CrUX data in Search Console and tools like PageSpeed Insights to identify pages with poor LCP (over 2.5 s), CLS (over 0.1), or INP (over 200 ms).
- Prioritize fixes by impact: Fix issues on your most visited pages first. A slow homepage hurts more than a slow blog post from 2019.
- Optimize images and videos: Compress images, serve WebP, and use responsive image sizes. For videos, use lazy loading and avoid auto-play.
- Reduce server response time: If your TTFB (time to first byte) is high, investigate hosting, CDN, and server-side caching.
- Minimize render-blocking resources: Defer non-critical CSS and JavaScript. Inline critical CSS for above-the-fold content.
- Monitor continuously: Set up a weekly or monthly performance report using tools like Lighthouse CI or SpeedCurve. Don’t rely on a one-time fix.
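The baseline-and-prioritize steps above can be sketched as a small script. It assumes you have already pulled 75th-percentile field metrics per page (from the CrUX API or a Search Console export) into plain dicts; the thresholds are Google’s published “good” bands:

```python
# Google's "good" thresholds for 75th-percentile field values.
THRESHOLDS = {"lcp_ms": 2500, "cls": 0.1, "inp_ms": 200}

def assess_page(field_data):
    """Return the metrics that fail Google's 'good' band for one page."""
    return {metric: value
            for metric, value in field_data.items()
            if metric in THRESHOLDS and value > THRESHOLDS[metric]}

def prioritize(pages):
    """Order failing pages by traffic, so the most-visited get fixed first.

    `pages` is a list of dicts with 'url', 'visits', and p75 metric fields
    (a hypothetical shape for your own export).
    """
    failing = [(p["url"], p["visits"], assess_page(p)) for p in pages]
    failing = [f for f in failing if f[2]]  # keep only pages with failures
    return sorted(failing, key=lambda f: f[1], reverse=True)
```

The output is exactly the prioritized fix list you should expect in an agency’s performance report: which pages fail, on which metric, ranked by how much traffic they carry.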
Link Building: What Works and What Doesn’t
Link building remains a core part of off-page SEO, but the landscape has changed. Google’s algorithms are better at detecting manipulative link patterns—and penalties can be severe. Your agency should focus on earning links through quality content and genuine outreach, not buying links or participating in private blog networks.

What to look for in a link building campaign:
- Relevance over volume: A link from a high-authority site in your industry is often more valuable than many links from random directories. Your agency should target sites that are topically related to your content.
- Content-driven outreach: The best links come from creating something worth linking to—original research, comprehensive guides, or data visualizations. Your agency should produce linkable assets, then pitch them to relevant publishers.
- Backlink profile audits: Before building new links, your agency should audit your existing backlink profile using tools like Majestic or Ahrefs. Identify toxic links (spammy directories, link farms) and disavow them if necessary. A sudden influx of low-quality links can be a risk.
- Natural anchor text distribution: Avoid over-optimized anchor text (e.g., “best SEO agency” in every link). A healthy profile includes branded anchors, naked URLs, and generic phrases like “click here.”
- Risk awareness: Black-hat link building—buying links, using automated tools, or participating in link exchanges—can lead to manual penalties. A reputable agency will never promise “guaranteed first page ranking” or claim “we will never be penalized.” Those are red flags.
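Anchor text distribution is straightforward to audit once you export your backlink anchors from a tool like Ahrefs or Majestic. A minimal bucketing sketch; the `brand_terms` list and the bucket names are illustrative assumptions:

```python
from collections import Counter

def anchor_distribution(anchors, brand_terms=("acme",)):
    """Bucket backlink anchor texts into rough categories.

    A profile dominated by the exact/partial-match bucket is the classic
    over-optimization signal described above. `brand_terms` is a
    hypothetical brand list you would supply for your own site.
    """
    buckets = Counter()
    for anchor in anchors:
        a = anchor.lower().strip()
        if a.startswith(("http://", "https://", "www.")):
            buckets["naked_url"] += 1
        elif any(b in a for b in brand_terms):
            buckets["branded"] += 1
        elif a in ("click here", "here", "this site", "read more"):
            buckets["generic"] += 1
        else:
            buckets["exact_or_partial_match"] += 1
    return buckets
```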
| Approach | Risk Level | Potential Reward | Notes |
|---|---|---|---|
| Guest posting on relevant sites | Low to medium | High (if content is valuable) | Requires quality content and outreach effort |
| Broken link building | Low | Medium | Time-intensive but low risk |
| Buying links from link brokers | High | Low (short-term) | High risk of manual penalty; Google actively targets paid link schemes |
| Private blog networks (PBNs) | Very high | Low (short-term) | Almost certain penalty if detected |
| Content syndication (e.g., Medium) | Low | Low (usually nofollow) | Good for exposure, not direct ranking boost |
How to Brief Your Agency: A Practical Checklist
You don’t need to be an SEO expert to get good work from your agency. But you do need to ask the right questions and set clear expectations. Use this checklist when briefing your agency or evaluating their proposals:
- Define success metrics: Agree on KPIs that matter—organic traffic, keyword rankings for target terms, conversion rate, or revenue. Avoid vague metrics like “brand awareness” without a measurement plan.
- Request a technical audit plan: Ask for a detailed scope of the audit, including which tools they’ll use, what they’ll check (crawl budget, Core Web Vitals, duplicate content, etc.), and how they’ll prioritize fixes.
- Set content strategy expectations: Don’t accept a keyword list without intent mapping. Ask for a content calendar that shows topic clusters, target keywords, and expected publication dates.
- Clarify link building tactics: Require transparency on their link building methods. Ask for examples of recent outreach, sample anchor text distribution, and a plan for disavowing toxic links.
- Establish reporting cadence: Monthly reports should include actionable insights, not just pretty charts. Ask for a summary of what was done, what changed (rankings, traffic, conversions), and what’s planned next.
- Build in a performance review: After 3–6 months, review progress against baseline. If there’s no improvement in organic traffic or keyword rankings, reassess the strategy—or the agency.
Summary: What a Good SEO Agency Delivers
A reliable SEO agency doesn’t promise instant results or guaranteed rankings. Instead, they provide:
- A thorough technical audit that identifies crawl budget issues, duplicate content, and Core Web Vitals problems—with a clear prioritization of fixes.
- On-page optimization that maps content to search intent, uses clean header structures, and optimizes images and internal links.
- Site performance improvements based on field data, not just lab scores, with continuous monitoring.
- Ethical link building through content-driven outreach and careful backlink profile management.
- Transparent reporting that ties activities to business outcomes.
For more on how to evaluate SEO services, check our guide on technical SEO audits and on-page optimization best practices.
