Jul 28, 2005
 

AskJeeves is rolling out its keyword-based advertising program on Monday, August 1, 2005, in its first move to break away from a dependence on Google ads for revenue. Even though the Ask network receives 70 percent of the revenue from the Google ads displayed on the AskJeeves pages, the extra 30 percent would be a boon to this underdog search engine.

AskJeeves has a contract with Google until 2007 for the delivery of ads on its search engine results pages. Until that time, however, the new AskJeeves ads will be displayed above the Google ads, generating better click-through rates.

But AskJeeves and its keyword-based advertising program also have some more ambitious goals. According to the MyWay website, “Jeeves will syndicate its ads onto other sites, including InfoSpace Inc.’s (INSP) Dogpile, CNET Networks Inc.’s (CNET) Search.com and ValueClick Inc.’s (VCLK) Search123.”

Eyeballing the lucrative search engine ad market that Google and Yahoo have cashed in on, AskJeeves is also expected to start delivering ads to other IAC/InterActiveCorp-owned websites. IAC acquired AskJeeves in March 2005 and owns other popular properties such as Expedia.com, CitySearch.com, Hotels.com, TicketMaster.com and Match.com.

Because the search engine marketing cash cow has now grown to an enormous size, it’s no wonder that the smaller engines want to mark off their stakes (or steaks) as well.

Jul 26, 2005
 

Google and TopCoder are collaborating to offer Google Code Jam 2005, where the world’s top programmers compete for cash and recognition. With a prize purse of $155,000 (who picked this figure?), 100 participants from around the world will compete in the final rounds by solving programming problems and breaking each other’s code.

This Code Warrior competition is available in four programming languages – Java, C++, C#, and VB – and the winner brings home $10,000 of the total purse. There are four phases to the Google Code Jam 2005 competition: Downloading the Arena, Coding Phase, Challenge Phase and System Tests.

The Arena is a Java Applet that one can download ahead of time to practice on sample problems before the start of the competition. In the Coding Phase, participants enter the arena in groups of 10 and race to see who can deliver the most accurate solution in the shortest amount of time.

In the Challenge Phase, programmers try to break each other’s code. Points are given for successful attempts to break another programmer’s code and taken away for failed attempts.

At the end of the Challenge Phase, the System Test phase begins: an automated, objective test of the competitors’ work, after which final points are awarded to the contestants.

According to Search Engine Journal, “Last year, more than 7,500 participants from more than 100 countries competed in the Google Code Jam. Sergio Sancho of Buenos Aires, Argentina was the winner of the 2004 competition, and Jimmy Mardell of Stockholm, Sweden, took home the grand prize in 2003.”

Google Code Jam 2005 registration begins July 25 and closes on August 19, 2005. Qualification rounds begin on August 22 and the Championship round of the final 100 participants is scheduled for September 23, 2005.

Google Code Jam 2005 is destined to rock the house down. Bring your Frisbees, beach balls and lighters to the Googleplex in Mountain View, California this fall – and watch out, San Jose Jazz Festival, the Google Code Jam 2005 is coming to town.

Jul 20, 2005
 

Google has just completed its newest Page Rank and Back Links update. Some have gained PR and some have lost it, which is typical. What is not typical is that many sites that have done active link-building in the past 3 months have actually lost many of their Google back links.

What’s up with that? Well, I have a theory. My theory is that Google is now ignoring text links that appear to be paid text links as far as PR and its back links counter are concerned. For instance, I have a PR6 website that is indexed daily that I have been using to get my new websites listed quickly in Google.

I’ve placed many off-topic text links in the left panel of the homepage. Up until this latest Google update, these websites showed a back link in Google to the PR6 website. Now, however, the Google back links from the new websites have dropped off the map. I’m also currently having trouble getting my new websites listed quickly in Google using this method.

The text links on my PR6 website are not paid text links, but to a search engine robot they would appear to be. This is why I now believe that Google has taken steps to ignore paid text links in its quest to enforce its “natural linking” ideal of how the web should be won.

Jul 18, 2005
 

Many people have lost that lovin’ feelin’ when it comes to the past few Google Page Rank updates. Google itself has declared that Page Rank is for “entertainment purposes only.” It is also well known that Page Rank is not tied to the search engine results pages (SERPs), so a PR7 website can be buried in the rankings while a PR3 page may hold the top position.

Still, though, a fair number of people continue to get excited every time Google updates the Page Rank in its toolbar. Many companies still use their site’s PR number as a cost basis for offering advertising on their sites. Others see PR as a good indicator of their link-building efforts over the past 3 months (or lack thereof).

It appears as if Google is trying to put its Page Rank updates on a quarterly schedule. They missed the last PR update by 6 weeks, however, while this time they missed their quarterly mark by only 2 weeks.

Google’s back link update is also something that many watch to see how well Google is responding to their linking campaigns. Since Google only lists back links with a PR4 or above, this tool has been somewhat suspect from the very beginning. There is a better tool from MarketLeap.com that measures back links across all of the major search engines.

No matter what, though, the Google Page Rank and Back Link update is sure to raise the volume of the chatter at the SEO and Webmaster message boards for the next couple of weeks.

Jul 16, 2005
 

The Wayback Machine is a fun little tool for going back in time to see what old, archived web pages once looked like. If you would like to see some of your old web pages, those of a competitor, or those of a website you’re about to buy, you can check them out on the Wayback Machine.

In an odd legal turn of events, though, the Wayback Machine was used in a lawsuit to prove copyright and now is itself being sued. According to the New York Times, “The Internet Archive was created in 1996 as the institutional memory of the online world, storing snapshots of ever-changing Web sites and collecting other multimedia artifacts. Now the nonprofit archive is on the defensive in a legal case that represents a strange turn in the debate over copyrights in the digital age.”

Healthcare Advocates is suing the owners of the Wayback Machine under the Digital Millennium Copyright Act and the Computer Fraud and Abuse Act because the Philadelphia law firm Harding Earley Follmer & Frailey used the Wayback Machine to prove its case against them. Healthcare Advocates says that access to its old web pages is “unauthorized and illegal.”

For those who use the Wayback Machine outside the legal profession (historians, SEOs, web businesses and other interested parties), this has chilling implications for the future of the popular Internet tool. Hopefully this setback for Wayback will be resolved in such a way that the Wayback Machine can continue to be used for its original purpose – entertainment – and not as a tool for endless copyright lawsuits, which would naturally lead to its demise.

Jul 08, 2005
 

There’s a nifty little tool from Copyscape that will help you find other websites that have taken your copy without asking. This copyright tool uses the Google API to query Google for other websites that contain the same text as the page at the URL you enter into the search box.
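The same basic idea can be sketched without any search API at all: pull a few distinctive sentences from your own page and check whether they appear verbatim on a page you suspect of copying you. The Python sketch below is only an illustration of that exact-phrase matching idea, not Copyscape’s actual method, and the URLs in it are hypothetical placeholders.

```python
# A minimal sketch of copy detection by exact-phrase matching. It is NOT how
# Copyscape works internally -- just an illustration of the underlying idea.
import re
import urllib.request

def fetch_text(url):
    """Download a page and strip its HTML tags into rough plain text."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    return re.sub(r"<[^>]+>", " ", html)

def copied_sentences(my_url, suspect_url, min_words=8):
    """Return sentences from my_url that also appear verbatim on suspect_url."""
    my_text = fetch_text(my_url)
    suspect_text = " ".join(fetch_text(suspect_url).split())
    matches = []
    for sentence in re.split(r"[.!?]\s+", my_text):
        sentence = " ".join(sentence.split())
        # Only bother with reasonably long, distinctive sentences.
        if len(sentence.split()) >= min_words and sentence in suspect_text:
            matches.append(sentence)
    return matches

if __name__ == "__main__":
    # Hypothetical URLs for illustration only.
    hits = copied_sentences("http://www.example.com/", "http://www.example.net/")
    print(f"{len(hits)} copied sentence(s) found")
```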

By using this copyright tool, you will be able to see if anyone has taken your copy and take appropriate action: first asking them to take down your text, then contacting their hosting company or ISP if necessary.

In fact, I have used this tool myself and found that a company from India has copied my entire homepage and placed it on their homepage. I have contacted them about this matter and hopefully the duplicate text will be down soon.

This copyright tool is also great for finding websites that may be impacting your site by handing it a duplicate content penalty from Google. It is well known that Google likes unique content. It is also well known that if Google finds two websites with identical content, it will usually penalize both with a duplicate content penalty, burying both sites in the rankings.

If someone takes your copy without asking, it could be out of laziness or maliciousness. If they’re lazy, they are simply taking your copy rather than going to the trouble of writing their own. If they’re malicious, they are intending to bury your website in the rankings with a duplicate content penalty.

No matter what the reason, though, it is important to protect your copyrighted material from theft, and this little tool will help you do just that. Remember, the tool is free for all to use, so there is no need to copy it and place it on your own website.

Jul 05, 2005
 

There’s a new tool in town that will check the anchor text of your back links within the Yahoo database. Why would anyone want such a tool? If you have a website that has been around a while and you’ve changed its keywords several times (like I have), you may wish to check how many different keyword phrases you have out there diluting your latest keyword-change efforts.

For instance, if you have a website currently optimized for the keyword phrase ‘auto repair’ and wish to change it to ‘car repair’ it would be good to know how many instances of ‘auto repair’ you have on different sites so that you can gauge the efforts it will take to rank well, through link-building, for ‘car repair’. This is also true if you have additional anchor text floating around the web optimized for ‘automobile repair’, ‘vehicle repair’, ‘automotive parts’, and ‘Mazda performance kits’.
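If you wanted to eyeball this dilution yourself, a simple tally is all it takes. The sketch below assumes you already have a list of (linking URL, anchor text) pairs exported from a backlink report; the sample data is hypothetical and the counting is just an illustration, not the tool’s actual implementation.

```python
# A minimal sketch of an anchor-text dilution check over a hypothetical
# backlink export: tally how often each anchor phrase points at your site.
from collections import Counter

backlinks = [
    ("http://www.example-directory.com/autos", "auto repair"),
    ("http://www.example-blog.com/fixit", "car repair"),
    ("http://www.example-forum.com/thread/42", "auto repair"),
    ("http://www.example-links.com/page1", "automobile repair"),
]

anchor_counts = Counter(anchor.lower() for _, anchor in backlinks)

print("Anchor text breakdown:")
for phrase, count in anchor_counts.most_common():
    share = 100.0 * count / len(backlinks)
    print(f"  {phrase!r}: {count} link(s) ({share:.0f}%)")
```

If your target phrase accounts for only a small slice of the total, you can see at a glance how much link-building it will take to shift the balance.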

If your website has too many variations of anchor text pointing at your homepage, then your site may be too diluted in the anchor text arena to rank well for your current keyword phrases. This may be just the reason you’ve been looking for to build a brand new website optimized for your current keyword phrase rather than trying to revamp an older website.

Anyway, the tool is worth checking out as it will give you some additional knowledge about your website that you probably didn’t know before. And knowledge can’t be all that bad, can it?

Jul 01, 2005
 

Google has introduced a new beta product called Google SiteMaps. Google SiteMaps is a way for Webmasters to tell Google’s robot directly that their website’s content has changed and needs to be re-indexed.

According to Google, “Google Sitemaps is an experiment in web crawling. Using Sitemaps to inform and direct our crawlers, we hope to expand our coverage of the web and improve the time to inclusion in our index. By placing a Sitemap-formatted file on your webserver, you enable our crawlers to find out what pages are present and which have recently changed, and to crawl your site accordingly.”

As part of Google SiteMaps, Google offers its Sitemap Generator to format a website’s URLs into an XML-formatted file that the robot can easily read. For those who know scripting, customized sitemaps can also be generated.
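To give a flavor of what such a custom script does, here is a minimal sketch that turns a hand-made list of page URLs and last-modified dates into a sitemap file. The page list is hypothetical, and the namespace and optional tags (changefreq, priority) should be checked against Google’s own documentation rather than taken from this sketch.

```python
# A minimal sketch of a custom sitemap generator: wrap each URL in the
# <urlset>/<url>/<loc>/<lastmod> structure of the Sitemap protocol and write
# the result to sitemap.xml. Verify the exact namespace Google expects.
from xml.sax.saxutils import escape

pages = [
    ("http://www.example.com/", "2005-07-01"),
    ("http://www.example.com/about.html", "2005-06-15"),
]

def build_sitemap(pages):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for url, lastmod in pages:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")
        lines.append(f"    <lastmod>{lastmod}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    with open("sitemap.xml", "w") as f:
        f.write(build_sitemap(pages))
```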

SEOs and others have known for a couple of years now the value of placing a sitemap on your website. It has been one method of encouraging the robots to crawl your site frequently and giving them a one-stop shopping place to get all of the information they need.

Google has taken this one step further by introducing its own Google SiteMaps, which will spoon-feed its robot the information it needs in the format it requires. For impatient Webmasters keeping close track of how often Googlebot crawls their sites, Google SiteMaps will be welcomed with open arms.