August 22, 2006
A great post by Andy Hagans at SEO Book: 101 Ways to Build Link Popularity in 2006
Take special note of the section “71 Good Ways to Build Links”.
A somewhat overlooked post on Jim Boykin’s blog: Click Rate for Top 10 Search Results:
Datamining of the AOL data apparently shows some clear figures for clickthrough rates on Top 10 positions:
Total Clicks: 4,926,623
Click Rank1: 2,075,765
Click Rank2: 586,100 = 3.5x less
Click Rank3: 418,643 = 4.9x less
Click Rank4: 298,532 = 6.9x less
Click Rank5: 242,169 = 8.5x less
Click Rank6: 199,541 = 10.4x less
Click Rank7: 168,080 = 12.3x less
Click Rank8: 148,489 = 14.0x less
Click Rank9: 140,356 = 14.8x less
Click Rank10: 147,551 = 14.1x less
And here are the same figures, with each position compared against the one immediately above it:
Click Rank1: 2,075,765
Click Rank2: 586,100 = 3.5x less than ^
Click Rank3: 418,643 = 1.4x less than ^
Click Rank4: 298,532 = 1.4x less than ^
Click Rank5: 242,169 = 1.2x less than ^
Click Rank6: 199,541 = 1.2x less than ^
Click Rank7: 168,080 = 1.2x less than ^
Click Rank8: 148,489 = 1.1x less than ^
Click Rank9: 140,356 = 1.05x less than ^
Click Rank10: 147,551 = 1.05x more than ^
August 1, 2006
Usually Google does a pretty good job of indexing. Traditionally, Google was more interested in capturing as much information from the web as possible.
More recently, it seems as though Google’s policy may have changed - though any existing indexing concerns may be due to Google’s bad data push.
I’ve seen this issue really acutely on a relatively new forum I set up on a relatively established website. So far, Google hasn’t indexed it at all.
Bad data push, or something restrictive on the website?
Either way, I’m going to do a little experiment here - will listing a couple of the main forum boards help with indexing?
Traditionally it would - but if not, it may well suggest that something new really is at play in terms of Google indexing, which would be a very interesting change indeed.
Here are the boards:
Now let’s test the indexing…
July 13, 2006
Here’s the conundrum - if you find yourself linking to the same site twice on a page, or else linking the same keywords more than once, it can be difficult to tell whether both links carry any value, or only the first.
For example, if you do either of the following, then what are the actual link benefits?
- Same keyword linked twice on a page, to different URLs
- Same URL linked twice on a page, for different keywords
Here’s a live experiment you can refer to, using two nonsense words, and links to different sites.
1. Repeated keyword
If I link for a nonsense word to two different domains, do search engines really credit the keyword to both domains, or is only the first counted?
So I’ll link XGGFLLGLYZ here to Google -
And I’ll link XGGFLLGLYZ here to MSN.
2. Repeated link
If I use the same URL on a page, linked to for two different keywords, then will search engines give equal credit to the URL for both keywords - or simply credit one?
So I’ll link TTXMBBLEWBBLE to Yahoo!
and also link TTXMBBLEYAAKD to Yahoo! as well!
Whatever the results are that come in, it’s always worth remembering that search engines are constantly tweaking their algorithms.
So this is an experiment you may want to keep an eye on in the continuing changes - just to see whether those repeated URLs/keywords are really delivering benefits.
Sometimes you may be considering adding content to a third-party site, not least for link benefits.
Whether it’s a link exchange, forum link drop, blog comment, or article submission, the sites for links are available - but some throw in a nasty surprise by using the “nofollow” attribute.
Looking through page code for nofollow is arduous at best - so make it easy on yourself by editing the CSS of Mozilla Firefox so that anything with nofollow is clearly highlighted on a page. :)
Here’s what you need to do:
1. Install Mozilla Firefox, of course. :)
2. Surf to the following file (Windows XP):
\Documents and Settings\
3. Now add the following to the bottom of that file:
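The exact snippet isn’t preserved above, but the rule commonly used for this trick looks something like the following (the colours here are just an example - pick whatever stands out for you):

a[rel~="nofollow"] {
  /* outline and tint any link carrying rel="nofollow" */
  border: thin dashed firebrick !important;
  background-color: rgb(255, 200, 200) !important;
}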
4. IMPORTANT: Now rename “userContent-example.css” to “userContent.css”.
There - now done. :)
Many SEOs already know this trick for spotting nofollow links quickly and easily, but if you’re still exploring the waters, or only a small-scale link finder, then you should probably find this quite useful. :)
July 12, 2006
If SEO is about attracting targeted visitors, then methods that can legitimately increase your site’s exposure to extra traffic work well within that remit.
One way of doing this is via syndication, and one of the big daddies of this is Google News. Any site that can get itself syndicated via Google News can potentially tap into extra traffic streams.
Not everyone can qualify, though. Generally, you will want to ensure that any site submitted is family friendly, is constantly updated, and is updated with articles of real informational value.
Another qualifying criterion is that sites run by private individuals will not be syndicated. To be accepted into Google News, you have to ensure that the site is publishing under the auspices of a recognised corporate body.
Something else to bear in mind - Google *do* keep a record of which sites have been submitted for Google News, so if you find yourself being rejected at first, don’t make a second request for inclusion until you’ve addressed any potential issues that caused its rejection in the first place.
On a more technical note, I’ve found that Google News can have problems where the article title is also a link to the article itself. So if you do have a site accepted for Google News, do watch out for this issue, and edit your site accordingly - ie, unlinked title at top of article, plus permalink at bottom.
Of course, it’s also worth saying that images posted with your articles on your site can help clickthrough - even if your main headline gets buried in the “similar stories” link, your image can still show next to the whole entry.
On that point, it’s worth emphasising that the more original your information, the better - the same old popular news stories rewritten can simply keep you hidden in the “similar stories” list, leading to minimal traffic - while an original story is more likely to capture a swell of new visitors via Google News.
You also need to ensure that you add content regularly - Google News will publish a news item for 30 days, then it is removed from the news results. So constant new content is required to maintain any kind of presence in Google News.
Overall, being syndicated by Google News can bring in the extra traffic - but do note two key pitfalls.
The first is that this type of visitor is usually looking for information quickly, rather than a purchase. So try to find a way to encourage them to remain on the site or revisit it - an “Add to Favourites” link could be invaluable here.
The second is that scrapers can and do republish stories from Google News, so watch out for your content being taken by other sites, not least “Spamsense” sites. Also, some scrapers post directly to spamblogs, so be aware of spam pingbacks being sent if your own site accepts them.
Here’s a summary of the above points:
1. Keep your site updated
2. Keep it clean and fresh
3. Be as original as possible
4. Use images where you can
5. Watch for technical problems
6. Ensure you submit company-administered sites
7. Don’t resubmit your site after rejection, unless you’ve made real changes
Here’s the submit URL: Submit to Google News
July 8, 2006
Too often prospective clients are looking to capture traffic and sales from just a couple of different major keywords.
Often, these keywords are very similar – singular and plural versions of the same term, for example.
The real secret to working with keywords in links is to work with as varied a spread of keywords as possible, in order to capture as many different keyword combinations as possible.
Although Wordtracker and Overture keyword checkers may show some keywords as having high volumes of searches, you often need to be cynical about these.
After all, in competitive keyword areas, many searches are performed by vendors checking their own positions – instead of actual prospective customers. And this bloats the traffic volumes.
What Wordtracker and Overture keyword lists often don’t illustrate is that there can be an incredible range of related keyword searches.
Each of these may have a very small volume – perhaps one search a month – but once you start to add these together, they can equate to a HUGE volume of potential traffic that even beats major keywords.
What’s more, you can often find that the more competitive keywords are relatively generic – good for window shoppers – so the conversion ratio isn’t so great.
However, much more precise searches in the Longtail can convert much better, because you’re capturing search traffic that has already made a purchasing decision.
Last year, a new client who was otherwise “sandboxed” for most of his major keywords switched off his PPC and went on holiday for 2 weeks.
When he came back, there was over £8,000 ($15,000) in orders waiting for him. And this is during a slow season.
Almost all of these orders were generated through a large volume link campaign, which specifically aimed to capture lots of different keyword combinations – links that didn’t simply aim for major keywords, but a whole variety of them.
The Longtail remains a very underestimated SEO strategy, but you ignore it at your own cost.
There’s a lot of general discussion on the need for “on-topic” links.
Google is perceived by some to particularly reward “on-topic” links.
But not every page with a link is going to be devoted to that topic – heck, you only have to consider Google itself.
Consider how many times you see people link to Google – how many of those webpages are specifically about search engines?
Probably not many – but instead about an unrelated subject, where Google is recommended as a resource for finding out more.
Off-topic links are a natural part of the structure of the web, and because of that, are always useful – for human users and search engines.
And the simple truth is that off-topic links work and are effective for SEO campaigns.
While on-topic links are desirable, insisting on them can really push up a budget considerably. Frankly, for many small businesses, insisting on on-topic links only is not simply difficult, it’s unnecessary.
My personal SEO philosophy is not to simply deliver results, but to deliver results in a very cost-effective manner.
After all, would you rather pay $5,000 or $1,000 to generate $10,000 in sales?
Off-topic links can be a very cost-effective way to generate keyword links in volume – and develop corresponding traffic and sales because of it.
Web directories have a variable reputation in SEO.
On the one hand, they are often regarded as little better than link farms. Even worse, they almost certainly provide “low-trust” links.
Even still, I use directory listings as a strategic part of my overall SEO campaign.
This is not because I want to use directory titles to “link bomb” my way into listings – rather, they are a key part of a spidering strategy.
Having a search engine friendly site requires that search engines can actually find and index it, and web directories are a way to help ensure a stable of static links that search engines can use to find it.
It also helps raise the overall profile of a website online.
Sometimes the page listings can capture key Longtail searches that – if you are listed prominently enough – can help expose your target company to human users, and even capture qualified sales leads.
When it comes to SEO strategies, a directory-only approach isn’t going to be a big help for competitive business verticals.
However, when rolled into a wider SEO campaign utilising multiple strategies, I personally consider it to be an invaluable part of the armoury.
July 3, 2006
Far too often I see prospective clients only look to rank on Google.
Sure, Google is the most widely used search engine – but if you develop a strategy that only caters for Google, you may be crippling yourself unnecessarily.
Yahoo! and MSN together can capture almost 50% of the search market, so anyone looking to simply target Google may do so at the expense of other search traffic avenues.
This is particularly the case for businesses that worry most about getting on-topic links, building links gradually, and only taking links from high PageRank pages.
Not only is this going to shift costs significantly upwards, it can additionally kneecap traffic and sales from other sources.
The bottom line is that if you have a new domain, and are unlikely to rank well in the short-term for Google due to sandboxing – then forget about Google.
Instead, target Yahoo! and MSN to deliver traffic and sales from these search engines primarily for major targeted keywords, while also capturing a lot of Longtail traffic from Google.
Then, when Google finally starts to relax the filters on your domain, you already have a very strong SEO campaign platform to take advantage of this.
The bottom line is that Google is not the only search engine – and those businesses that realise this are going to be more successful than those that don’t.
July 2, 2006
One of the more recent “crazes” is to write articles to generate links.
It’s not a new method for SEO – it’s as old as the internet. The difference is that, as Google becomes more restrictive on ranking criteria, article submissions have become the latest desperate grab for rankings.
The problem here is that while links spread across a larger number of websites and IP ranges are welcome for link building, the duplicate content issue means that many SEOs focused on article writing may be negating their own strategy.
This is especially so if we presume that, when it comes to duplicated content, Google is not going to allow much “juice” to be sent from such pages.
Even if you argue that some benefits may still exist with duplicated content, the bottom line is that it’s unwelcome.
To help circumvent this, here are a few tips when it comes to writing and submitting articles to third-party websites:
1. Vary your article titles as much as possible
2. Vary the subheadings. You do use keyword subheadings in your article, don’t you? If not, you should look to do so.
3. Vary the text where possible – you can even set up a simple script to randomise the central paragraphs, so that many of your article submissions will be slightly but significantly different (see the sketch after this list).
4. Vary resource boxes – the information and links they contain – as much as possible. Also ensure you add at least one good deep content link per resource box.
5. Submit to fewer article sites – less really can be more. By submitting to the best quality article sites out there, not only can you limit the extent of your duplicated content, but you can also ensure the best sites provide the strongest thrust for your links.
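On tip 3, here’s a minimal sketch of what such a randomising script might look like - Python purely for illustration, and it assumes your articles are stored as plain text with blank lines between paragraphs:

import random

def shuffle_middle_paragraphs(article_text):
    # Split on blank lines into paragraphs (assumes plain-text articles).
    paragraphs = [p for p in article_text.split("\n\n") if p.strip()]
    # Too short to be worth shuffling - return unchanged.
    if len(paragraphs) <= 3:
        return article_text
    # Keep the opening and closing paragraphs in place, shuffle the middle.
    middle = paragraphs[1:-1]
    random.shuffle(middle)
    return "\n\n".join([paragraphs[0]] + middle + [paragraphs[-1]])

Each run produces a slightly different ordering, which is all tip 3 is really after.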
As with other linking methods, article writing and submission isn’t going to be an effective strategy all by itself.
However, I do use it to supplement other key SEO strategies in order to create a more stable platform for an SEO campaign.
And also don’t forget that article submissions timed with new content on your own site could be a way to help improve how fast that content is indexed – a key concern in more competitive markets.
July 1, 2006
The Google Sandbox is an old system.
It was first noticed in April 2004, when volume link builders (such as myself) found that instead of immediate ranking effects from volume link building, as expected, it now took 3 months before those links would have an impact.
Since then the Google Sandbox has come to encompass a varied range of filters that Google seems to have developed in order to prevent easy manipulation of rankings – with the result that many newer domains have an awful time ranking for major keywords.
If you’re a results-driven person like myself, then it can be initially unnerving. But all is not lost.
For websites with a lot of content (such as informational sites and ecommerce sites) you can still capture a lot of Longtail searches.
I have a client who is still sandboxed for a number of his targeted keywords. However, we capture enough sales from Longtail traffic to make the SEO campaign more than profitable for him on that basis alone.
However, I’ve recently taken up a new strategy for him to help fight sandboxing.
Instead of just trying to rank his website, I’m now ranking webpages on “trusted” third party sites – on pages dedicated to promoting his products/services.
So far it’s working very well – because the pages are on older and more trusted domains, Google has no problem ranking those pages for the keywords used in the links.
In just a couple of weeks he captured Top 10 rankings for half of his targeted keywords with this method alone, plus he also has his own site represented in various positions – all in addition to the capture of sales from Longtail search traffic.
Although ranking third-party sites instead of your targeted website isn’t ideal – not least because it introduces another click between a visitor finding a listing to view, and the destination webpage – it can at least provide a short-term solution to a short-term problem.
June 30, 2006
In April Google attempted to overhaul their datacenter network with a new way of spidering the web, with a specific aim to save on bandwidth and increase efficiency.
The result was 3 months of Google problems, which Google later put down to a “bad data push”.
Key problems included sites disappearing from Google, or only having a few pages listed instead of the proper number of pages.
To many webmasters, this was a serious problem, one they have been relieved to see finally rectified for the most part.
However, a loss of site inventory in search isn’t necessarily a bad thing – it can even be an opportunity.
Specifically, to dare to rewrite URLs without having to worry too much about loss of traffic – when the traffic is already lost.
Rewriting URLs can be anything from straightforward to unnervingly complex, depending on how you approach it. A good rewrite solution to make pages search engine friendly should be able to cater for most needs.
But oftentimes the main headache is redirecting the old URLs afterwards. Sometimes – on sites with thousands of pages – it requires such a comprehensive set of redirects that the easiest way to deal with it is simply to apply the rewrite solution and then hope the search engines catch up on displaying the right pages, without losing too many users.
That’s where a “bad data push” can be taken advantage of – if your URLs aren’t exactly how you want them to be, then applying a new rewrite solution while the traffic is already lost simply adds pain to existing pain, and leaves you with a satisfactory solution at the end of it.
Then when Google comes back in with full indexing, it’s your new URLs that it will pick up, having lost your old URLs anyway.
It’s not an ideal situation, of course – no one really wants to lose traffic – but at least there are times when a loss of traffic can be taken advantage of to apply new solutions to old problems, so that once your website recovers, you can better exploit its potential for search engine optimisation.