June 30, 2006
In April Google attempted to overhaul its datacenter network with a new way of spidering the web, with the specific aim of saving bandwidth and increasing efficiency.
The result was three months of Google problems that Google later put down to a “bad data push”.
Key problems included sites disappearing from Google entirely, or having only a few pages listed instead of the full complement.
For many webmasters this was a serious problem, and they have been relieved to see it finally rectified, for the most part.
However, a loss of site inventory in search isn’t necessarily a bad thing – it can even be an opportunity.
Specifically, to dare to rewrite URLs without having to worry too much about loss of traffic – when the traffic is already lost.
Rewriting URLs can be anything from straightforward to unnervingly complex, depending on how you approach it. A good rewrite solution for making pages search engine friendly should be able to cater to most needs.
But often the main headache is redirecting the old URLs afterwards. On sites with thousands of pages, this can require such a comprehensive set of redirects that the easiest way to deal with it is often to simply apply the rewrite solution and hope the search engines catch up on displaying the right pages without losing too many users.
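As a rough sketch of what such a rewrite-plus-redirect setup involves, here is what it might look like on an Apache server with mod_rewrite enabled – the URL patterns, script name, and parameter name are all hypothetical, stand-ins for whatever your site actually uses:

```apache
RewriteEngine On

# Permanently redirect old dynamic URLs (e.g. /catalog.php?item=blue-widget)
# to the new search-friendly form, so search engines transfer any listings.
# THE_REQUEST is checked rather than QUERY_STRING so this rule only fires
# on what the client actually requested, not on our own internal rewrite.
RewriteCond %{THE_REQUEST} \?item=([a-z0-9-]+)
RewriteRule ^catalog\.php$ /products/%1? [R=301,L]

# Serve the search-friendly URLs (e.g. /products/blue-widget) by
# rewriting them internally back to the old dynamic script.
RewriteRule ^products/([a-z0-9-]+)$ /catalog.php?item=$1 [L]
```

The 301 (permanent) status is the important detail: it tells search engines the old URL has moved for good, rather than a 302, which would keep the old URL in the index.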
That’s where a “bad data push” can be taken advantage of: if your URLs aren’t exactly how you want them to be, applying a new rewrite solution now merely adds short-term pain on top of pain you have already suffered – and ends with a satisfactory solution.
Then when Google comes back in with full indexing, it’s your new URLs that it will pick up – your old URLs were lost anyway.
It’s not an ideal situation, of course – no one really wants to lose traffic – but there are times when a loss of traffic can be turned into an opportunity to apply new solutions to old problems, so that once your website recovers, you can better exploit its potential for search engine optimisation.