In the digital marketing world, it’s easy to think of Google as the all-seeing, all-knowing Wizard of Oz, eyeing our every mobile-optimized move from behind a green curtain. Since our rankings live or die by the whim of Google’s web crawlers, it’s tempting to see them as the infallible arbiters of SEO value. But they aren’t: if Google’s process were perfect, its search algorithm wouldn’t need near-daily updates.
In fact, SEO is a partnership between Google’s web crawlers and the sites they crawl. If we don’t do our job to maintain well-optimized, crawlable sites, the whole operation falls apart. That’s why site health is so critical: as DeepCrawl’s Head of Product Alec Bertram says, “If search engines can’t see your content, all the good stuff is kind of pointless.” The web crawler is an ally, not a judge.
Your site may look nice on the surface, but without a comprehensive site crawl, it’s impossible to tell what web crawlers will and won’t see. Does an image have multiple thumbnail sizes? Can users disable user-generated content? Is an item in your retail catalogue now out of stock? As a site grows more complex, so does the opportunity for a performance-breaking bug.
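One of the simplest crawlability signals to check is whether your robots.txt is accidentally blocking pages you want indexed. As a minimal sketch (using Python’s standard-library `robotparser`, with a hypothetical robots.txt and example.com URLs for illustration):

```python
from urllib import robotparser

# Hypothetical robots.txt content; in practice, fetch your site's real file.
robots_txt = """
User-agent: *
Disallow: /admin/
Disallow: /cart/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check which paths a crawler is allowed to fetch.
for path in ["/products/widget", "/admin/login", "/cart/checkout"]:
    allowed = rp.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

A surprising number of ranking drops trace back to a stray `Disallow` rule shipped with a redesign, so a check like this is cheap insurance.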
So what’s a forward-thinking marketer to do? The solution is to schedule regular site health check-ups. There are numerous options for this, but the one we recommend is DeepCrawl, the industry leader: if we didn’t think they were the best, we wouldn’t integrate their data with our platform.
In addition to a host of other services, such as supporting website migrations and helping sites recover from Panda and Penguin penalties, DeepCrawl offers the most comprehensive site crawling service in digital marketing. You can schedule crawls to run automatically at whatever cadence suits you, from hourly to monthly.
Easily identify any orphaned pages that may need your attention.
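Conceptually, an orphaned page is one your sitemap claims exists but that no internal link actually reaches. A minimal sketch of the idea, using hypothetical URL sets (in practice these would come from parsing your XML sitemap and from a full crawl of your internal links):

```python
# Hypothetical data for illustration only.
sitemap_urls = {
    "https://example.com/",
    "https://example.com/products",
    "https://example.com/old-promo",  # no longer linked from anywhere
}
linked_urls = {
    "https://example.com/",
    "https://example.com/products",
}

# Orphaned pages: listed in the sitemap but unreachable via internal links.
orphans = sitemap_urls - linked_urls
print(sorted(orphans))  # → ['https://example.com/old-promo']
```

Orphans like these still consume crawl budget and can quietly accumulate as a catalogue changes, which is why surfacing them automatically matters.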
You can also see at a glance what’s changed since your last crawl and track your progress (or regressions) over time with historical data. This long-term, insights-over-data approach is the difference between frantically reacting to drops in performance and commanding a total understanding of how your site is functioning.
This strategy is vitally important and works extremely well, up to a point. But what if you need more advanced ways to dig into site health issues unique to your business, say, managing web crawler visibility across thousands of pages spanning multiple domains and brands?
This enterprise-scale problem is exactly why Conductor built the FlexHub, a collection of marketing business intelligence tools meant to help marketers operate at global scale. The FlexHub’s Site Health heat maps give an at-a-glance view of critical site health issues across multiple domains, powered by our DeepCrawl integration, so urgent problems are always in view.
Get a high-level breakdown of your pages to discover any problems you might need to address.
Site crawlability can seem opaque at times, but when well-managed, it’s a major boost to your traffic — and revenue. When it comes to your site health, carpe that diem and take the initiative to bump yourself to the top of the spiders’ list. It pays off big time in the long run.