How to Fact-Check Your SEO Agency’s On-Page & Content Optimization Work
You’ve hired an SEO services agency. They’ve promised technical audits, on-page optimization, and performance-driven growth. But three months in, your organic traffic hasn’t budged, and the reports are full of jargon you can’t verify. Before you fire them or double down, you need a systematic way to fact-check their work. This checklist will help you separate genuine technical expertise from smoke and mirrors—without needing to become an SEO specialist overnight.
Step 1: Verify the Technical SEO Audit Was More Than a Screenshot
A proper technical SEO audit isn’t a one-page PDF with a red “critical issues” badge. It’s a deep dive into how search engines crawl, render, and index your site. Start by asking for the raw crawl data—not just the summary. A reputable agency will share their crawl report from tools like Screaming Frog, Sitebulb, or DeepCrawl, showing exactly which URLs were crawled, how many returned 4xx or 5xx status codes, and which pages were blocked by robots.txt.
Here’s what to check:
| Audit Component | What to Look For | Red Flag |
|---|---|---|
| Crawl budget analysis | Evidence of crawl waste (e.g., infinite parameter URLs, session IDs) | No mention of crawl budget or only talks about “increasing crawl rate” |
| robots.txt validation | Screenshot of actual file + list of disallowed paths that matter | Generic “robots.txt is fine” without checking for accidental blocking of key pages |
| XML sitemap health | Sitemap includes only indexable, canonical URLs; lastmod dates are accurate | Sitemap contains 301 redirects, broken links, or noindex pages |
| Canonical tag implementation | Every page has a self-referencing canonical or a precise cross-domain one | Canonical tags point to homepage on category pages, or are missing entirely |
| Duplicate content detection | List of duplicate or near-duplicate pages with a remediation plan | Claims “no duplicate content” without running a content similarity check |
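You can spot-check the robots.txt row in the table above yourself: Python's standard-library `urllib.robotparser` can replay your live rules against the pages you care about. A minimal sketch; the rules and URLs below are placeholders for your own, and `blocked_urls` is an illustrative helper name:

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules -- in practice, fetch your live /robots.txt instead.
ROBOTS_RULES = [
    "User-agent: *",
    "Disallow: /private/",
    "Disallow: /cart",
]

def blocked_urls(rules, key_urls, agent="Googlebot"):
    """Return the key URLs this user agent is not allowed to fetch."""
    rp = RobotFileParser()
    rp.parse(rules)  # or rp.set_url("https://yoursite.com/robots.txt"); rp.read()
    return [u for u in key_urls if not rp.can_fetch(agent, u)]

pages = [
    "https://example.com/products/widget",
    "https://example.com/private/notes",
]
print(blocked_urls(ROBOTS_RULES, pages))  # -> ['https://example.com/private/notes']
```

If a page your agency claims to have optimized shows up in this list, the "robots.txt is fine" line in their audit was never actually checked.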
If the agency says “your crawl budget is fine” without showing you the data, push back. Crawl budget matters most for large sites, but even smaller sites benefit from understanding how Googlebot allocates resources. A real audit will also include Core Web Vitals data from Google Search Console or CrUX—not just Lighthouse scores on a single page.
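Sitemap health is also easy to verify without the agency's tooling. A minimal sketch, assuming a standard XML sitemap: it extracts every `<loc>` entry, then flags entries that error or silently redirect (the function names are our own, and you would substitute your sitemap's URL):

```python
import xml.etree.ElementTree as ET
from urllib.request import Request, urlopen

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"  # standard sitemap namespace

def sitemap_urls(xml_text):
    """Extract every <loc> URL from an XML sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

def check_entry(url):
    """Flag entries that error or redirect; a clean sitemap lists only
    final, 200-status, indexable URLs."""
    try:
        resp = urlopen(Request(url, method="HEAD"), timeout=10)
    except Exception as exc:
        return "ERROR %s" % getattr(exc, "code", exc)
    if resp.geturl().rstrip("/") != url.rstrip("/"):
        # urlopen follows redirects, so a changed final URL means this
        # sitemap entry is a redirect -- exactly the red flag in the table.
        return "REDIRECTS to " + resp.geturl()
    return "OK"
```

Run `check_entry` over the output of `sitemap_urls` on your live sitemap; any result other than "OK" belongs in your next agency call.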
Step 2: Validate Core Web Vitals Improvements with Real User Data
Core Web Vitals are a set of metrics measuring real-world user experience: Largest Contentful Paint (LCP) for loading speed, Interaction to Next Paint (INP) for interactivity (INP replaced First Input Delay, FID, as a Core Web Vital in March 2024), and Cumulative Layout Shift (CLS) for visual stability. Many agencies will show you a Lighthouse score of 95 and call it done. That's misleading.
Lighthouse is a lab test—it simulates a single device and network condition. Real user data comes from the Chrome User Experience Report (CrUX) and Google Search Console’s Core Web Vitals report. Ask your agency to show you the CrUX data for your site’s origin, broken down by URL group. If they can’t pull this report, they haven’t verified their fixes actually worked in the wild.
What to fact-check:
- LCP improvements: Did they reduce server response time, optimize images, or lazy-load below-the-fold content? If they only added a plugin, test the page with WebPageTest on a 3G connection.
- CLS fixes: Did they set explicit width/height on images and ads, or reserve space for dynamic content? A common mistake is fixing CLS on desktop but ignoring mobile.
- INP optimization: This metric measures responsiveness to user interactions. If the agency only optimized LCP and CLS, they missed a key piece. INP improvements often require breaking up long JavaScript tasks.
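You don't need the agency to pull CrUX data for you, either: the CrUX API is public and needs only a Google API key. A hedged sketch below; `p75_summary` is a helper name of our own, the thresholds are Google's published "good" boundaries, and the parsing assumes the API's documented `record.metrics.<name>.percentiles.p75` response shape:

```python
import json
from urllib.request import Request, urlopen

# Google's published "good" thresholds for the 75th percentile.
THRESHOLDS = {
    "largest_contentful_paint": 2500,  # milliseconds
    "interaction_to_next_paint": 200,  # milliseconds
    "cumulative_layout_shift": 0.1,    # unitless score
}

def p75_summary(record):
    """Label each Core Web Vital's 75th-percentile value as good or not."""
    out = {}
    for metric, threshold in THRESHOLDS.items():
        data = record.get("metrics", {}).get(metric)
        if data:
            p75 = float(data["percentiles"]["p75"])  # API may return strings
            out[metric] = {"p75": p75, "good": p75 <= threshold}
    return out

def query_crux(origin, api_key, form_factor="PHONE"):
    """One CrUX API call for a whole origin; requires a Google API key."""
    url = ("https://chromeuxreport.googleapis.com/v1/records:queryRecord"
           "?key=" + api_key)
    body = json.dumps({"origin": origin, "formFactor": form_factor}).encode()
    req = Request(url, data=body, headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)["record"]
```

If `p75_summary(query_crux("https://yoursite.com", key))` shows a metric still failing on mobile, the agency's "fixed" claim hasn't survived contact with real users.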

Step 3: Scrutinize the On-Page Optimization Strategy
On-page optimization isn’t just stuffing keywords into title tags and meta descriptions. A competent agency will start with keyword research and intent mapping—understanding whether a search term is informational, navigational, or transactional—before touching a single page. If they optimized your “best running shoes” page with a transactional intent for a keyword that’s mostly informational, you’ll get traffic that bounces immediately.
Here’s a quick checklist for on-page work:
- Title tags: Are they unique per page, under 60 characters, and include the primary keyword naturally? Watch for keyword stuffing or generic titles like “Home | Company Name.”
- Meta descriptions: Are they written for click-through rate, not just keyword inclusion? A good meta description answers the search intent and includes a call to action.
- Header structure: Does each page have a single H1 that matches the topic? Are H2s used to break up sections logically? Avoid agencies that put keywords in every H tag regardless of relevance.
- Content quality: Did they rewrite existing content or just add keyword density? Look for thin content (under 300 words) on pages that should be comprehensive.
- Internal linking: Are they linking to relevant pages within your site using descriptive anchor text? A common shortcut is linking to the homepage from every page—that’s not internal linking, it’s a navigation crutch.
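Several of these checks are easy to script yourself. The sketch below uses Python's standard-library `html.parser` to flag overlong titles and multiple H1s on a page's raw HTML; `OnPageAudit` and `audit` are illustrative names, and the 60-character limit mirrors the checklist above:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collect the <title> text and count <h1> tags from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of on-page issues found in one page's HTML."""
    parser = OnPageAudit()
    parser.feed(html)
    issues = []
    if len(parser.title.strip()) > 60:
        issues.append("title over 60 characters (%d)" % len(parser.title.strip()))
    if parser.h1_count != 1:
        issues.append("expected 1 <h1>, found %d" % parser.h1_count)
    return issues
```

Feed it the HTML of the pages your agency billed for; a page they "optimized" that still fails these basics deserves a pointed question in the next review.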
Step 4: Question the Link Building Tactics
Link building is where many agencies cut corners—and where you can get hit with a Google penalty if they use black-hat techniques. Your agency should be transparent about their outreach methods. If they say “we build links through guest posting” but can’t show you the actual guest posts or the sites they’re published on, be suspicious.
What to ask:
- Source of backlinks: Are they from relevant, authoritative sites in your industry? A backlink from a random directory or a link farm will do more harm than good.
- Link profile growth: Is it gradual and natural, or did you suddenly get a large number of new links in a short period? Unnatural spikes are a red flag for automated link building.
- Domain Authority and Trust Flow: These third-party metrics (from Moz and Majestic, respectively) aren't perfect, but they help gauge link quality. If your agency is building links from sites with low authority, question the value.
- Nofollow vs. dofollow: A healthy link profile includes a mix. If all new links are dofollow, that may appear unnatural.
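A crude way to spot-check link-profile growth from a backlink export: compare each month's new-link count with the average of the months before it. The 3x multiplier below is an arbitrary heuristic of ours, not an industry standard, and `flag_link_spikes` is an illustrative name:

```python
def flag_link_spikes(monthly_new_links, multiplier=3.0):
    """Return indices of months whose new-link count exceeds `multiplier`
    times the average of all preceding months -- a rough unnatural-spike
    heuristic, not a penalty predictor."""
    flagged = []
    for i in range(1, len(monthly_new_links)):
        prior_avg = sum(monthly_new_links[:i]) / i
        if prior_avg > 0 and monthly_new_links[i] > multiplier * prior_avg:
            flagged.append(i)
    return flagged

# Example: steady ~10 links/month, then a sudden burst of 90.
print(flag_link_spikes([10, 12, 11, 90]))  # -> [3]
```

A flagged month isn't proof of automated link building, but it is exactly the kind of spike you should ask the agency to explain link by link.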
Step 5: Examine the Redirect and URL Structure Changes
When an agency performs a technical SEO audit, they often recommend URL changes—consolidating duplicate pages, fixing broken links, or restructuring directories. These changes must be handled with 301 (permanent) redirects, not 302s or 307s, which signal a temporary move and may not consolidate ranking signals at the new URL. A single wrong redirect can tank your traffic for weeks.
Fact-check these items:

| Redirect Type | What It Means | Common Mistake |
|---|---|---|
| 301 (permanent) | Passes most link equity; tells search engines the page moved permanently | Using 302 for permanent moves, which may not transfer authority |
| 302 (temporary) | Signals a short-term move; may not consolidate link equity at the new URL | Leaving 302s in place for months, causing loss of ranking power |
| 307 (temporary) | Like a 302, but guarantees the request method is preserved; rarely relevant for SEO | Using 307 when 301 is appropriate |
| 404 (not found) | Page no longer exists; high-value URLs should be redirected, deliberately removed pages can return 410 (Gone) | Leaving high-traffic pages as 404 without redirecting to a relevant replacement |
Ask your agency for a redirect map—a spreadsheet showing old URLs, new URLs, and the redirect type. If they can’t provide this, they’re likely making changes without a systematic approach. Also, check that they’ve updated your XML sitemap to reflect the new URL structure. A sitemap full of 301-redirected URLs is a waste of crawl budget.
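Given such a redirect map, you can verify each row yourself. The sketch below fetches the old URL without following redirects and checks that it returns a 301 pointing at the expected target; `classify` and `check_row` are illustrative helper names:

```python
from urllib.error import HTTPError
from urllib.request import HTTPRedirectHandler, Request, build_opener

def classify(status, location, expected_new_url):
    """Judge one redirect-map row from the observed status and Location."""
    if status == 301 and location.rstrip("/") == expected_new_url.rstrip("/"):
        return "OK"
    if status in (302, 303, 307):
        return "temporary redirect (%d) where a 301 was expected" % status
    if status == 301:
        return "301 to unexpected target: " + location
    return "no redirect (status %d)" % status

class _NoRedirect(HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface the 3xx as an HTTPError instead of following it

def check_row(old_url, expected_new_url):
    """Fetch old_url without following redirects, then classify the result."""
    opener = build_opener(_NoRedirect())
    try:
        resp = opener.open(Request(old_url, method="HEAD"), timeout=10)
        return classify(resp.status, "", expected_new_url)
    except HTTPError as err:
        return classify(err.code, err.headers.get("Location", ""), expected_new_url)
```

Run `check_row` over every line of the agency's spreadsheet; anything other than "OK" means the map and the live site disagree.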
Step 6: Review the Reporting Cadence and Metrics
A good SEO agency will report monthly, but more importantly, they’ll report on metrics that tie to business goals—not vanity metrics like “impressions” or “keyword rankings” for terms no one searches. Your report should include:
- Organic traffic to key pages: Not just total sessions, but traffic to pages you care about (product pages, blog posts with lead magnets).
- Conversion rate from organic traffic: Are visitors taking the desired action? If traffic is up but conversions are flat, something is off.
- Core Web Vitals progress: Month-over-month improvement in CrUX data for LCP, CLS, and INP.
- Backlink profile changes: New links acquired, lost links, and authority shifts.
- Crawl and index status: How many pages are indexed, any new crawl errors, and changes in crawl rate.
Step 7: Test Their Understanding of Your Business
Finally, the best way to fact-check an SEO agency is to ask them to explain their strategy in plain language. If they can’t articulate why they’re optimizing a particular page or building links from a specific site, they’re probably following a template, not a custom strategy.
Schedule a 30-minute call and ask:
- “Why did you choose these keywords for our homepage?”
- “How does this content strategy support our sales funnel?”
- “What’s the biggest technical issue you found, and how did you fix it?”
- “If we stopped working together tomorrow, what would we need to maintain?”
Final Checklist for Fact-Checking Your SEO Agency
Use this as a quick reference before your next monthly review:
- Raw crawl data shared (Screaming Frog or equivalent)
- Core Web Vitals report from CrUX, not just Lighthouse
- Keyword intent mapping documented for each optimized page
- Unique title tags and meta descriptions per page
- Internal linking strategy with descriptive anchor text
- Link building sources verified (guest posts, outreach emails)
- Redirect map with 301 codes for all URL changes
- Monthly report with conversion metrics, not just traffic
- Agency can explain strategy in business terms, not SEO jargon
For more guidance, check out our guides on technical SEO audits and content strategy planning.
