You may or may not have heard of the term “Crawl Budget”. A crawl budget is how many pages Google will crawl on your website on any given day.
Just think: there are billions of websites out there competing for Google’s attention, but Google only spends a minimal amount of time on each of them. When it comes to your website, you want to make it as easy as possible for the Google bot (or Google spider, whatever you like to call it) to crawl through your content and eat up as much information as possible. If it only gets limited time, it only learns a tiny bit about your website and will have trouble ranking you.
What does a Google bot do?
A Google bot crawls the content of your website’s pages to get an understanding of what your website is about.
The Google bot crawls for:
- Updated relevant content with common search keywords;
- External links (links going to websites outside your own); and
- Internal links (links within your own website).
This crawling feeds into your Google Rank – how high up you appear in Google searches.
How to make the most of the Google bot’s visit
There are plenty of things you can do to make the most of the bot’s visit and achieve the best possible rank.
Reduce website errors
When the Google bot finds an error, it’s like hitting a brick wall at a dead end. That error could be what tells the bot it’s time to move onto the next website.
Use your Google Search Console to search for critical errors such as dead links and slow load times.
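If you want a quick supplementary check outside Search Console, you can script one yourself. A minimal sketch using only Python’s standard library (the function names and severity labels here are my own, not an official tool):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def classify(status):
    """Map an HTTP status code to a rough severity for crawl-budget purposes."""
    if status >= 500:
        return "server error"
    if status >= 400:
        return "dead link"
    if status >= 300:
        return "redirect"
    return "ok"

def check_links(urls):
    """HEAD-request each URL and report anything a crawler would stumble on."""
    report = {}
    for url in urls:
        try:
            with urlopen(Request(url, method="HEAD")) as resp:
                report[url] = classify(resp.status)
        except HTTPError as e:
            report[url] = classify(e.code)
        except URLError:
            report[url] = "unreachable"
    return report
```

Run `check_links` over a list of your internal URLs and anything labelled “dead link” or “server error” is a brick wall for the bot.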
Help the bot focus on the important stuff
We don’t want the Google bot to waste time crawling through parts of your website that don’t require indexing. A classic example of this is your shopping cart or checkout page.
Ensure that non-critical and private pages are set up to be inaccessible to the Google bot. Speak to your web developer about ways to do this.
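One common way to do this is a robots.txt file at the root of your site (the paths below are examples – use your own cart and checkout URLs). Note that robots.txt discourages crawling but is not access control; genuinely private pages still need authentication, which is worth raising with your developer:

```text
User-agent: *
Disallow: /cart/
Disallow: /checkout/
```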
Reduce your redirects
Redirects certainly have their place and can be incredibly useful. You may have an old piece of content you’re retiring but don’t want to remove completely, so a redirect would tell the bot that there’s a similar piece of content it could look at instead.
That said, you want to limit the number of redirects on your website because, for each redirect, the bot has to crawl two pages, which reduces the time spent crawling quality content.
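In particular, avoid redirect chains: if /old-article redirects to /newer-article which redirects to /new-article, the bot burns three fetches on one piece of content. Point retired URLs straight at their final destination – for example, on an nginx server (paths illustrative):

```nginx
# Send the retired page to its replacement in a single hop
location = /old-article {
    return 301 /new-article;
}
```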
Use internal links
Internal linking is critical and is an essential part of your page’s architecture. Internal links are the gateways to each section of your website, and you want to make the ride from one side of your website to the other really smooth for the bot (and your audience!).
Don’t let any of your pages become an orphan page!
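One way to spot orphans is to compare the pages you know about (say, from your sitemap) against the pages your internal links actually reach. A minimal sketch with made-up URLs:

```python
def find_orphans(all_pages, links):
    """Return pages that no other page links to.

    all_pages: set of URLs (e.g. from your sitemap)
    links: dict mapping each page to the set of pages it links to
    """
    linked_to = set()
    for page, targets in links.items():
        linked_to.update(targets)
    # The homepage is the entry point, so it doesn't count as an orphan
    return all_pages - linked_to - {"/"}

pages = {"/", "/about", "/blog", "/blog/old-post"}
links = {"/": {"/about", "/blog"}, "/blog": set()}
print(find_orphans(pages, links))  # {'/blog/old-post'}
```

Anything this turns up needs an internal link pointing at it (or retiring altogether).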
Increase your site speed
Chances are, your homepage doesn’t contain all the vital information about your business, so if the loading time of the page is slow, you’re really just wasting a golden opportunity.
Check for errors that slow your page, including oversized images, and optimise.
Look at your website architecture
Make sure the structure of your website is nice and flat. This is not only for the Google bot but also for your general audience.
You don’t want them to click endless links within links, drilling down through a million pages. Eventually, the bot, or your audience member, will run out of time, patience, and interest! There should be a maximum of two clicks to get to the information that they need.
Maximise that crawl budget
Take the time to closely scrutinise your website and modify it so that getting from one side of your website to the other is as easy and seamless as possible. This will increase your Google ranking chances – the smoother and quicker the experience, the more information both the bot and your audience can devour.