Dec 29, 2005

Well, I don’t usually report on this sort of thing, but a new SEO contest, similar to the earlier “nigritude ultramarine” contest, has just been announced. The new keyword phrase will be revealed on January 15, 2006 at noon PST.

This new SEO contest targets Google and the winner will receive $1000 in cash, with those in second, third, fourth and fifth places also receiving lesser amounts of cash.

Back in 2004, the Nigritude Ultramarine contest proved that the website with the most backlinks was the one that won. At the time, it showed skeptics that off-page optimization was more important for Google rankings than on-page optimization.

Since 2006 is a new year, and many changes have taken place in Google’s algorithm since 2004, it will be interesting to see whether the “one with the most toys wins” maxim still holds true when it comes to backlinks being the most important part of SEO.

The contest ends May 15, 2006, with the website holding the highest Google ranking for the announced keyword phrase taking the prize. Anyway, it should be a fun contest worth following, and there are bound to be many SEO lessons learned along the way.

Dec 25, 2005

Merry Christmas, Happy Holidays, Happy Hanukkah and a most Merry Kwanzaa! I hope it has been a good year for everyone from an SEO and Internet marketing perspective. My wish is that in 2006 we all find the prosperity we are looking for online and continue to build upon what we learned in 2005. Perhaps that will turn out to be the best gift we could have received this holiday season!

Dec 15, 2005

There is a debate raging right now about whether publishers should have sponsored links on their websites. Jeremy Zawodny of Yahoo gives sponsored links the thumbs up, and Matt Cutts of Google gives them the thumbs down.

Many publishers feel that they have the right to monetize their own websites by any means possible, and this includes sponsored links. The real debate seems to be over whether to use the rel="nofollow" attribute on the sponsored links or to leave it off in order to pass a little link love on to the advertisers.

Zawodny says he screens his advertisers and links only to high-quality websites, and that neither his website nor his advertisers should be penalized in the search engine rankings. Cutts, on the other hand, says that the nofollow attribute is needed to ensure publishers don’t link to spammy sites or websites with little value to visitors.

It’s a great debate, and it’s similar to the ‘bad neighborhood’ linking debate that arose a couple of years back, which has helped many webmasters become more aware of just whom they are linking to and why it matters. Right now, the safest method for Google’s sake is to use rel="nofollow" on sponsored links (not on reciprocal links, as your partners will surely get mad at you) to ensure your site receives decent rankings in the SERPs.
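
For what it’s worth, here is a minimal sketch of how that might look if you generate your pages with a bit of Python. The render_link helper and the idea of flagging a link as sponsored are just my own illustration, not anything Google or Yahoo has published:

    def render_link(url, anchor_text, sponsored=False):
        # Build an <a> tag, adding rel="nofollow" when the link is sponsored.
        rel = ''
        if sponsored:
            rel = ' rel="nofollow"'
        return '<a href="%s"%s>%s</a>' % (url, rel, anchor_text)

    # Editorial link: passes link love as usual.
    print(render_link("http://example.com/article", "A useful article"))
    # <a href="http://example.com/article">A useful article</a>

    # Sponsored link: marked so search engines won't treat it as an endorsement.
    print(render_link("http://advertiser.example.com", "Our sponsor", sponsored=True))
    # <a href="http://advertiser.example.com" rel="nofollow">Our sponsor</a>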

Of course tomorrow may be a different story, so stay tuned.

Dec 08, 2005

When Yahoo lost the bid to acquire Voice over IP service provider Skype to online auctioneer eBay, it decided to take on Skype head-on.

Yahoo is adding some Skype-like services to its Yahoo Messenger service. Like Skype, Yahoo Messenger already offers free worldwide PC-to-PC calling.

Yahoo has decided to add a new paid service to its offerings. According to a Yahoo press release, “Yahoo’s new ‘Phone Out’ option enables users to call regular and mobile phones for one cent per minute in the United States and two cents a minute to about 30 other countries, including calls to Argentina, Australia, China, France, Germany, Italy, Japan and Korea.” Users can also sign up to receive unlimited calls from anywhere for about $30 per year.

By comparison, Skype is offering PC-to-phone calls for 2.3 cents per minute and a SkypeIn phone number costs about $35 per year. By not acquiring Skype in the bidding war with eBay, Yahoo saved itself $2.6 billion last October.
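
To put those quoted rates side by side, here is a quick back-of-the-envelope comparison; the 500 minutes per month of outbound calling is purely a made-up figure for illustration:

    minutes_per_month = 500          # hypothetical usage, just for illustration

    yahoo_per_minute = 0.01          # Yahoo "Phone Out", calls within the US
    skype_per_minute = 0.023         # Skype PC-to-phone
    yahoo_number_per_year = 30.00    # Yahoo unlimited incoming calls, approx.
    skype_number_per_year = 35.00    # SkypeIn phone number, approx.

    yahoo_yearly = minutes_per_month * 12 * yahoo_per_minute + yahoo_number_per_year
    skype_yearly = minutes_per_month * 12 * skype_per_minute + skype_number_per_year

    print("Yahoo: $%.2f per year" % yahoo_yearly)   # Yahoo: $90.00 per year
    print("Skype: $%.2f per year" % skype_yearly)   # Skype: $173.00 per year

At those quoted rates, the Yahoo user comes out well ahead, at least on paper.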

One has to think that Yahoo’s savings have been passed along to its customers, in turn saving them a pretty penny. And a penny saved is, uh, well, you know the rest.

Dec 01, 2005

I’ve heard a lot of discussion recently on whether there is indeed a duplicate content penalty within the search engines, particularly Google. The short answer for me is yes, under some circumstances.

Some critics argue that the duplicate content penalty is merely a myth and that a search engine such as Google wouldn’t have the horsepower or the inclination to check every page in its index against every other page for duplication.

However, if you try out a free service like CopyScape.com, you’ll certainly see that the technology exists to scan for duplicate content within the Google index. Also, if you’ve followed the search engines for the past several years, you’ll know that they have employed penalties for mirrored sites and redirects (other than 301 redirects), which in my mind shows that the inclination to penalize duplicate content pages is there.

What is more convincing to me, however, is my own personal experience over the past year resurrecting several customers’ web pages from search engine rankings oblivion. Simply changing the text on a single web page that I knew was duplicated elsewhere, either from my own searches or from the customer, was enough to see dramatic results within days.

Earlier I stated that this duplicate content penalty applies only in some circumstances. The circumstances I am referring to involve two web pages that are nearly identical textually. For instance, my own homepage has been pilfered by someone else (see CopyScape), but because that pilfered page contains plenty of other text besides the entirety of my homepage, it doesn’t seem to affect my rankings. So it seems that a certain ratio of duplicate to non-duplicate text, one I haven’t pinned down yet, is the threshold for the penalty.
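
Nobody outside of Google knows how (or whether) it actually measures this, but as a rough sketch of how such a ratio could be calculated, here is a short bit of Python that checks what fraction of one page’s word ‘shingles’ (overlapping five-word sequences, a window size I picked arbitrarily) also appear on another page:

    def shingles(text, k=5):
        # Return the set of k-word shingles (overlapping word windows) in a text.
        words = text.lower().split()
        return set(" ".join(words[i:i + k]) for i in range(len(words) - k + 1))

    def duplicate_ratio(page_a, page_b, k=5):
        # Fraction of page_a's shingles that also appear on page_b (0.0 to 1.0).
        a = shingles(page_a, k)
        b = shingles(page_b, k)
        if not a:
            return 0.0
        return float(len(a & b)) / len(a)

    # duplicate_ratio(pilfered_page, my_homepage) comes out much lower when the
    # pilfered page buries the copied text among plenty of its own text than
    # when it is a near-exact mirror of the same page.

If there really is a threshold, it would be some number like this that a page has to stay under to escape the penalty.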

Anyway, my conclusion is that a duplicate content penalty does apply to identical or nearly identical pages. However, it’s anyone’s guess what ratio of duplicate to non-duplicate text a page needs to stay under in order to avoid the penalty.