From Crawl Budget to Conversion: How a Structured On-Page Overhaul Rescued a Stalled B2B Domain

Note: The following case study is a hypothetical, educational scenario constructed to illustrate the principles of modern SEO agency practice. All company names, data points, and performance metrics are fictional. They are used solely for instructional purposes and do not represent actual client results.

The Situation: A Site with Traffic but No Traction

The client, a mid-market B2B software provider in the logistics space, approached SearchScope with a common but frustrating problem. Their website had been online for over three years, had a respectable backlink profile, and was generating a steady flow of organic traffic. Yet, conversion rates were flat, and the pages that ranked were not the pages that sold. A preliminary review of their Google Search Console data and a quick manual crawl revealed a deeper structural issue: the site was technically accessible but strategically disorganized. The core problem was not a lack of content, but a misalignment between technical signals, user intent, and content depth.

The client had previously worked with a generalist marketing agency that focused on volume—publishing blog posts weekly and building links at scale. However, the technical foundation had been neglected. The site suffered from crawl waste, thin content on key commercial pages, and a confusing internal linking structure that diluted authority. The brief was clear: stop the bleeding, fix the foundation, and then build for growth. This required a shift from a "more is better" approach to a precision-based on-page strategy.

Phase One: The Technical Audit – Discovering the Crawl Budget Leak

The first step was a comprehensive technical SEO audit. Using a combination of log file analysis and a full site crawl, the team identified several critical issues. The most impactful finding was a severe crawl budget problem. The site had over 4,000 parameterized URLs generated by a legacy filtering system on their product comparison pages. Googlebot was spending a disproportionate share of its allocated crawl budget on these near-duplicate URLs, leaving deeper, high-value service pages under-crawled and unindexed.

| Issue | Description | Impact on Performance |
| --- | --- | --- |
| Crawl Budget Waste | 4,000+ parameterized URLs for product filters; log files showed 60% of crawl requests were for these URLs. | Critical pages (pricing, case studies) crawled once every 2-3 weeks; new content took 30+ days to index. |
| Duplicate Content | Product category pages and blog tag pages had 80%+ text overlap; no canonical tags were implemented. | Search engines confused about which page to rank; link equity split across multiple versions. |
| Core Web Vitals (LCP) | Largest Contentful Paint averaged 4.2 seconds on mobile due to unoptimized hero images and render-blocking JavaScript. | High bounce rate for mobile users; potential ranking penalty for page experience. |
| XML Sitemap & robots.txt | Sitemap included all parameterized URLs; robots.txt did not block `/filter/` or `/sort/` paths. | Crawl instructions were contradictory; Googlebot was sent to low-value pages. |
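
Findings like the 60% crawl-share figure above come from the server logs rather than from a crawler. Below is a minimal sketch of that kind of tally, assuming a standard combined-format access log and treating `/filter/`, `/sort/`, and query-string URLs as the low-value bucket; the file name and patterns are illustrative.

```python
import re
from collections import Counter

# Matches the request path on a combined-format access log line, but only
# when the user-agent field mentions Googlebot. (In production you would
# also verify Googlebot hits via reverse DNS, since user agents can be spoofed.)
GOOGLEBOT_HIT = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*".*Googlebot')

def crawl_share(log_path: str) -> None:
    buckets = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            hit = GOOGLEBOT_HIT.search(line)
            if not hit:
                continue  # not a Googlebot request
            path = hit.group("path")
            low_value = "/filter/" in path or "/sort/" in path or "?" in path
            buckets["filter/sort/params" if low_value else "core pages"] += 1
    total = sum(buckets.values()) or 1
    for bucket, hits in buckets.most_common():
        print(f"{bucket}: {hits} requests ({hits / total:.0%} of Googlebot crawl)")

crawl_share("access.log")  # hypothetical log file name
```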

The solution was not a full site rebuild, but a surgical cleanup. The team implemented a new robots.txt configuration to disallow the parameterized filter paths. A new, clean XML sitemap was generated that included only the core 200 pages of the site. For the duplicate content issue, a canonical tag strategy was deployed. Each product category page was assigned a single, authoritative URL, and the tag archive pages were noindexed. Finally, the development team optimized the Largest Contentful Paint by compressing images to WebP format and deferring non-critical JavaScript. Within three weeks of these changes, log file analysis showed a 50% reduction in crawl requests to low-value pages, and the core service pages began to see daily crawl activity.
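
The robots.txt side of this cleanup is compact. A sketch of the kind of rules involved, with illustrative paths and an example domain:

```
User-agent: *
# Keep the legacy faceted-navigation paths out of the crawl
Disallow: /filter/
Disallow: /sort/

Sitemap: https://www.example.com/sitemap.xml
```

One caveat worth flagging: a disallowed URL can no longer be crawled, so any canonical or noindex tag on it will never be seen. That is why the tag archives here were noindexed rather than blocked, and why sequencing matters when combining these tools.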

Phase Two: On-Page Optimization and Intent Mapping

With the technical foundation stabilized, the focus shifted to on-page optimization. The previous content strategy had been keyword-centric without considering search intent. For example, the site had a lengthy guide titled "What is Logistics Software?" that ranked well for the informational query, but the page for "Enterprise Logistics Software Pricing" was a thin, 200-word landing page with no pricing information. This mismatch meant that users arriving from high-intent queries found a page that did not meet their needs, leading to a high bounce rate.

The agency conducted a full keyword research and intent mapping exercise. Every existing page was categorized into one of four intent buckets: Informational, Commercial Investigation, Transactional, and Navigational. The content strategy was then re-aligned. The "What is Logistics Software?" page was kept as a top-of-funnel resource, but it was internally linked to a new, comprehensive "Logistics Software Pricing Guide" that included a feature comparison table, a pricing calculator, and a clear call-to-action for a demo. This was a classic example of content gap analysis: the site had the traffic but not the conversion path.

| Intent Bucket | Original Page Focus | Optimized Page Focus | Key On-Page Changes |
| --- | --- | --- | --- |
| Informational | "Benefits of Cloud Logistics" (1,500 words, no CTA) | "Cloud Logistics Explained" (2,500 words, internal links to product pages) | Added H2 subheadings for readability; included a "next step" section linking to the commercial page. |
| Commercial Investigation | "Best Logistics Software 2024" (listicle, no comparison) | "Logistics Software Comparison: Top 5 Platforms" (3,000 words, feature table) | Added a comparison matrix; included user reviews; optimized meta description for "best logistics software for mid-market." |
| Transactional | "Software Pricing" (200 words, no prices) | "Enterprise Pricing Plans & ROI Calculator" (1,800 words, interactive table) | Added a clear pricing tier table; included a "request a quote" form; optimized page title for "logistics software pricing." |

The content strategy also addressed the site's link building potential. By creating genuinely useful, data-driven content (such as the pricing guide and a new industry benchmark report), the team gave other sites a natural reason to link. Instead of chasing low-quality directory links, the focus shifted to earning editorial links from logistics industry publications. The existing backlink profile scored well on Domain Authority but poorly on Trust Flow, owing to a high number of unrelated links. The new content gradually attracted relevant, high-trust links, improving the site's overall authority in the niche.

Phase Three: Performance Growth and the Role of Core Web Vitals

The final phase of the engagement focused on performance growth, driven primarily by the Core Web Vitals improvements made in Phase One and the content restructuring in Phase Two. The reduction in crawl waste meant that Google could discover and index the new, high-quality content much faster. The intent mapping ensured that the right pages were ranking for the right queries. The combination of technical health and content relevance created a virtuous cycle.

One of the most significant outcomes was the improvement in the site's Largest Contentful Paint (LCP). By moving from a 4.2-second LCP to under 2.5 seconds (the recommended threshold), the site not only avoided a potential ranking penalty but also saw a measurable improvement in on-page engagement metrics. The bounce rate on mobile traffic dropped by approximately 15% for the key commercial pages. This was not a direct ranking signal in the traditional sense, but it was a strong user experience signal that correlated with improved click-through rates from the search results.
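
The fix followed a standard pattern: request the hero image early, serve it in a modern format with a fallback, and keep non-critical scripts from blocking render. A sketch with placeholder file names:

```html
<!-- Preload the hero image so the browser requests it early. -->
<link rel="preload" as="image" href="/img/hero.webp" type="image/webp">

<!-- Serve WebP with a JPEG fallback for older browsers; explicit
     dimensions also prevent layout shift while the image loads. -->
<picture>
  <source srcset="/img/hero.webp" type="image/webp">
  <img src="/img/hero.jpg" alt="Logistics dashboard" width="1200" height="600">
</picture>

<!-- Defer non-critical JavaScript so it no longer blocks rendering. -->
<script src="/js/tracking.js" defer></script>
```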

The canonical tag strategy also paid dividends. By consolidating duplicate content signals, the site's primary pages began to accumulate link equity more efficiently. The "Logistics Software Pricing Guide" page, which previously competed with two other similar URLs for the same query, now became the single authoritative source. Within two months of the canonicalization, that page moved from page three to the top of page one for its target keyword.
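
The consolidation mechanism itself is a single element per page. A sketch, with an illustrative URL:

```html
<!-- On every near-duplicate variant (filtered, sorted, paginated),
     point search engines at the one authoritative version. -->
<link rel="canonical" href="https://www.example.com/logistics-software-pricing-guide/">

<!-- On tag archive pages: crawlable for link discovery, out of the index. -->
<meta name="robots" content="noindex, follow">
```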

Lessons Learned and Recommendations

This case demonstrates that effective SEO is not a single tactic but a layered process. The initial technical audit was essential to stop the site from bleeding crawl budget and confusing search engines. Without that foundation, the content strategy would have been wasted on a broken system. Conversely, technical optimization alone would not have generated growth; the site needed a content strategy that understood user intent and provided a clear conversion path.

For any organization considering an SEO agency engagement, the key takeaway is to insist on a data-driven technical audit as the first step. Ask for log file analysis, not just a crawl report. Ensure that the agency can explain how it will manage crawl budget and address Core Web Vitals. And critically, demand a content strategy that goes beyond keyword lists to include intent mapping and internal linking architecture. The best SEO is invisible: it makes the site faster, more useful, and easier to navigate for both users and search engines. The results are never guaranteed, but the process is replicable.

For further reading on how to build a robust technical foundation, see our guide on technical SEO audits. To understand how to structure your content for maximum impact, review our content strategy framework. And if you are evaluating your own site's health, our Core Web Vitals checklist is a good starting point.

Russell Le

Senior SEO Analyst

Russell specializes in data-driven SEO strategy and competitive analysis. He helps businesses align search performance with business goals.
