Dec 2, 2011

5 Ways to Prime the Social Pump



You've finally finished that epic blog post [infographic, LOLCat, Thundercats remix…] and tomorrow morning you’ll unleash it on the world. So, what should you do between now and then? You could take a nap, sure, or you could start priming the social pump early. Here are 5 tips for how to put your network to work before you need it…

0. Be Genuine

This is the pre-tip that makes all the other tips work. I honestly hate giving social media advice, because I find that just about every “Always do…” or “NEVER do…” has an exception. There are people who can Tweet out the same link 10 times a day and see great returns. There are others who can talk about nothing but what they eat for breakfast and get 10,000 happy followers.
What’s the difference? Sincerity, and a little moderation. If you’re genuine, believe in what you’re doing, and aren’t just trying to game the system, people will forgive the occasional over-indulgence. Just like we all deserve to eat a bit too much for the holidays, we’re all allowed to get carried away when we’re passionate about something we’ve created. Just do it because you mean it, and try not to overdo it.

Nov 19, 2011

Is Google Too Big To Fail?

We are better off if we ignore what Google is saying and follow one thing: Google wants more money for Google. When we make this assumption, everything Google does makes sense. Deception and doublespeak are logical and expected rather than shocking and upsetting.

When it comes to scale, as pointed out with Groupon, all of these rules go out the window. Look at the biggest advertisers: replace one of their accounts with a fresh account with no history, swap the brand "Geico" for "SEOBook auto insurance", and the campaign simply will not run. You are spam. In some cases larger advertisers are able to run ads which are clearly deceptive and go against guidelines that Google actively enforces on smaller advertisers. I now strongly suspect that this is institutionalized in Google's rating process, rather than a matter of some employee going out of their way to overturn a penalty.

Oct 29, 2011

How Big is Your Long Tail Keyword?

Choosing keywords to optimize for is a tricky business, made all the trickier as keyphrases grow longer than a couple of words. As Google has said, up to 20% of the search queries it sees on any given day are completely unique. Should you try to optimize your tauntaun sleeping bags product page for "tauntaun sleeping bag," for "children's tauntaun sleeping bag," or for "children's star wars tauntaun sleeping bag from hoth"? How can you research whether or not to optimize for such a long tail query?

This week we're asking the question: How big is your long tail? No innuendo intended. This is a totally serious question for the search world, wink wink, nod nod, say no more.

Many of you are familiar with the fact that the world of search is really dominated by this concept of the long tail. Google talks about this incredible metric: 20% of the searches performed every day are completely unique. Google has never seen that search performed on their engine before. No one in history has ever made that search. That happens on one out of every five queries, every single day.
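To put a number like that in perspective, here's a minimal sketch of how you might measure the same thing against your own query logs, such as site-search or analytics exports. The file names and the one-query-per-line format are assumptions for illustration:

    # Hypothetical sketch: estimate what share of today's queries have
    # never been seen before. Assumes two plain-text files, one query
    # per line; the file names are made up for this example.

    def load_queries(path):
        with open(path, encoding="utf-8") as f:
            return [line.strip().lower() for line in f if line.strip()]

    historical = set(load_queries("all_queries_to_date.txt"))
    today = load_queries("todays_queries.txt")

    never_seen = sum(1 for q in today if q not in historical)
    print(f"{never_seen / len(today):.1%} of today's queries are brand new")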

Oct 19, 2011

Manual Labor in the SERPs


Although Google explicitly denies it and Yahoo! obviously uses it, it has been suggested many times in the search research world that search engines use manual approval and ranking of sites for the top 2,000-5,000 queries. These queries make up between 20% and 35% of all queries on a commercial search engine, and by manually ranking the top 10 returned results for them, search engines can shave a significant burden off their server load. It's no surprise, too, that search researchers believe these manual rankings also improve both the quality and the perceived quality of results for common searches.
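As a rough illustration of that server-load argument, here's a toy sketch: head terms are served from a precomputed table, while everything else falls through to the expensive ranking path. The sample data and the run_full_ranking function are invented purely for the example, not how any engine actually works:

    # Toy illustration: if a few thousand head queries are 20-35% of all
    # traffic, serving them from a precomputed table avoids running the
    # full ranking pipeline for up to a third of requests.

    precomputed_top10 = {
        "credit cards": ["bank-one.example", "bank-two.example"],
        "car insurance": ["insurer-a.example", "insurer-b.example"],
    }

    def run_full_ranking(query):
        # Stand-in for the expensive algorithmic ranking path.
        return ["result.example/for-" + query.replace(" ", "-")]

    def serve(query):
        hits = precomputed_top10.get(query.lower())
        if hits is not None:
            return hits                  # cheap lookup for head terms
        return run_full_ranking(query)   # expensive path for the long tail

    print(serve("car insurance"))
    print(serve("children's tauntaun sleeping bag"))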
For obvious reasons, SEOs are fearful of this shift, but it is, in fact, a boon to the industry as a whole over the long term. Imagine being manually ranked in the top 10 for an exceptionally popular search term. The only way you can lose rankings is if the quality of your site/result deteriorates in comparison to the competition. Instead of link-building (which I personally find boring & distasteful), our jobs would be primarily about building the best, most unique content available for the subject. That sounds like a switch I'd be happy to make.

" The Philosophy of Ranking First "



The Philosophy of Ranking First

yellowwing, an SEW member from up north in Winnipeg, had this to say in a short post today:

I've taken the philosophy of SEO as 90% Linguistics and 10% Math. Beat the competition, not the search engine. It is much easier to analyze other sites than to reverse engineer what the vast team of Google Phd's (sic) have come up with lately.
 
This brings up an interesting perspective on the two ways that SEOs approach their dilemma. In my experience, simply beating the competition (at least at Google) is not enough these days to rank at the top. You must obliterate the competition in terms of both the quality and quantity of your content and links.
This could be due to the much-maligned 'sandbox' that Google has many sites in, or it could simply be a matter of getting enough attention. In either case, the same hard effort is necessary to compete.

Oct 18, 2011

Google Link: Command - Busting the Myths

I am NOT a fan of the Google link: command, and I'm shocked by the number of folks who operate in and around the SEO, webdev and technology industries who haven't realized this.

Here's what Google themselves have to say on the matter:
You can perform a Google search using the link: operator to find a sampling of links to any site. For instance, [link:www.google.com] will list web pages that have links pointing to the Google home page. Note there can be no space between the "link:" and the web page URL.

To see a much larger sampling of links to any verified site in Webmaster Tools:
  1. On the Webmaster Tools Home page, click the site you want.
  2. Under Your site on the web, click Links to your site.
Note: Not all links to your site may be listed. This is normal.
The short answer is that historically, we only had room for a very small percentage of back-links because web search was the main part and we didn't have a ton of servers for link colon queries and so, we have doubled or increased the amount of back-links that we show over time for link colon, but it is still a sub-sample. It's a relatively small percentage. And I think that that's a pretty good balance, because if you just automatically show a ton of back-links for any website then spammers or competitors can use that to try to reverse engineer someone's rankings.
Google themselves are telling us not to pay too much attention to the link: command, but that doesn't seem to be stopping folks. Let the myth busting commence.
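For instance, here's a toy sketch of why a capped sample is so misleading. The cap size and link counts below are invented, but the effect is the point: two sites with wildly different backlink profiles can report identical link: counts.

    # Sketch: if the engine shows at most a small random slice of the
    # real link graph, the reported count stops distinguishing sites.
    # The 250 cap and the link totals are made-up assumptions.

    import random

    def sampled_link_count(true_backlinks, cap=250):
        return len(random.sample(true_backlinks, min(cap, len(true_backlinks))))

    site_a = [f"linker-{i}.example" for i in range(40_000)]  # 40,000 real links
    site_b = [f"linker-{i}.example" for i in range(600)]     # 600 real links

    print(sampled_link_count(site_a))  # 250
    print(sampled_link_count(site_b))  # also 250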

Oct 16, 2011

SEO For The Big Three (Google, Bing, Yahoo)

Ranking your website highly on one of the "big three" search engines (Google, Bing or Yahoo) is a daunting task, let alone ranking your website highly on all three. Three engines, three algorithms, three different sets of rules – and yet there are websites out there that have first-page rankings across them all – how do they do it?

While all of the major search engines use different algorithms, the end goal of all three is the same: to provide the searcher with the most relevant results available. It is this one common thread that makes it possible for an SEO to rank a website highly across all the major engines. While there are a variety of factors at play and an even wider variation in the weight each of these factors is given, the possible variations that can produce relevant results are limited.

For example, if inbound links were given 0% weight, then insignificant sites would rank highly for high-competition phrases. Many reputable companies, such as Microsoft, could lose rankings for their own names, so links must and will always hold value. On the other hand, if links were to hold 100% weight, then spamming the search engines would be a simple matter. So there is only a limited range of weights between these extremes that this factor can hold, no matter which engine we are optimizing for.
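To see why the extremes fail, consider a toy blended score. The weights, signals, and numbers below are invented purely to illustrate the reasoning, not how any engine actually scores pages:

    # Toy model: blend a link signal and an on-page relevance signal.
    # At w = 0.0 an obscure, keyword-stuffed page outranks a trusted
    # brand; at w = 1.0 link volume alone would win. Values are made up.

    def toy_score(link_signal, onpage_signal, w):
        return w * link_signal + (1 - w) * onpage_signal

    pages = {
        "microsoft.com":            {"links": 0.99, "onpage": 0.60},
        "keyword-stuffed-page.biz": {"links": 0.01, "onpage": 0.95},
    }

    for w in (0.0, 0.5, 1.0):
        ranked = sorted(pages,
                        key=lambda p: toy_score(pages[p]["links"],
                                                pages[p]["onpage"], w),
                        reverse=True)
        print(f"w={w}: {ranked}")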

Solid SEO Through De-Optimization

That's right: today we aren't going to discuss optimization so much as its antithesis. Some may wonder what sense this makes. How can one say that the road to higher rankings is built on trying not to rank? In fact, the goal is still to rank highly; it's just the tactics that are a bit different.

What Is De-Optimization?
De-optimization is the reduction of those tell-tale signs of SEO that once upon a time worked very well and have only recently come to be viewed as blatant attempts at, well, ranking highly. To properly de-optimize a website, the following areas need to be addressed:

  • Keyword density
  • Back-link anchor text
  • The use of special text
  • Site relevancy 
With these areas addressed properly, a site stands a much higher chance of ranking for the phrases being targeted and, perhaps more importantly, holding those rankings over time.
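As a starting point for auditing the first item on that list, here's a small keyword-density check in Python. The "flag anything well above a few percent" guideline is an assumed rule of thumb, not a published engine limit:

    # Minimal keyword-density check for a de-optimization audit.
    # Counts exact occurrences of a target phrase and reports the share
    # of total words it consumes. Thresholds are a matter of opinion.

    import re

    def keyword_density(text, phrase):
        words = re.findall(r"[a-z0-9']+", text.lower())
        phrase_words = phrase.lower().split()
        n = len(phrase_words)
        hits = sum(1 for i in range(len(words) - n + 1)
                   if words[i:i + n] == phrase_words)
        return hits * n / len(words) if words else 0.0

    copy = ("Tauntaun sleeping bags are warm. Our tauntaun sleeping bags "
            "ship free. Buy tauntaun sleeping bags today.")
    density = keyword_density(copy, "tauntaun sleeping bags")
    print(f"{density:.1%}")  # this over-stuffed snippet scores very high

Run against real page copy, a result well above a few percent is a good sign the page reads like it was written for a crawler rather than a customer.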