
King + Queen = Royal Success

Ever wonder how or why your competitor gets better search engine rankings than you do? Does he know something you don’t? Well, maybe he actually does… and that’s where I come in. I’m here to unveil the tricks of the trade that may be giving your competitor that much-needed edge in search engine rankings, and to show you how you can reclaim your position without skipping a beat. Quite simply, I am going to hand you the key to the chest of information your competition is using to get better rankings.

It is not that difficult. There are two main criteria to getting good rankings:
1. Content (which is king)
2. Linking (which is queen)

Content is King

Content is easy to view and just as easy to measure: it is essentially the text you see on the site. If your competition has more text than you, consider increasing yours; if he has more keywords, consider adding more of your own; if he has bigger headers, consider beefing up yours. By all means, I am not saying to make a carbon copy of his site, but do look at the “on the site” factors you are lacking and evaluate whether to strengthen them on your own site. A note of caution: make sure that any changes you make actually improve the site and increase its value for your customers. In other words, don’t sacrifice quality for quantity unless you believe the added text actually betters the overall look, feel, and quality of your site.

Some ideas to increase quality/quantity of content:

  • Write a monthly article about your topic
  • Write a page about your keyphrase/product/service
  • If your headers don’t have your keyphrase in them, then be sure to add it
  • Make sure the titles on all your pages relate to the content of the page
  • Write, write, write (and then write some more!).

Linking is Queen

Linking requires more research. Look up your competitor’s site on Google by typing link:http://www.competitorsdomain.com/ into the search box (replace competitorsdomain.com with your competitor’s domain). The search results will be a list of the sites linking to your competitor. Go through that list and, for each site, figure out how he got that link. Some links might be paid for, some might be link exchanges, some might come from directories like dmoz.org or Yahoo, and others might be references or testimonials. Make a spreadsheet listing each site that links to your competition, and jot down how he got the link. Then, for each site that is appropriate, get it to link to you. Remember to stay away from link farms and low-quality links. Link farms are sites that contain nothing but links; low-quality links come from sites with little to no content. These sites will often try to exchange links with you: don’t do it! Only link to a site if it will enhance the user experience of your clients.
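The research workflow above (list every linking site and note how the link was obtained) can be sketched as a short script. The domains and link types below are hypothetical placeholders, not real data:

```python
import csv

# Hypothetical backlink data gathered from a "link:" search on Google.
# Each entry records a linking site and how the competitor obtained the link.
backlinks = [
    ("dmoz.org", "directory submission"),
    ("industry-portal.example.com", "paid link"),
    ("partner-site.example.com", "link exchange"),
    ("happy-customer.example.com", "testimonial"),
]

# Write the findings to a spreadsheet-friendly CSV file.
with open("competitor_links.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Linking site", "How the link was obtained"])
    writer.writerows(backlinks)

# Directories and testimonials are usually worth pursuing yourself;
# skip anything that looks like a link farm or a mass link exchange.
targets = [site for site, how in backlinks
           if how in ("directory submission", "testimonial")]
print(targets)
```

The resulting CSV opens directly in any spreadsheet program, so you can sort and annotate the list as you work through it.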

For more about linking, see my article entitled Linking is Queen
(https://www.redcarpetweb.com/promotion/0409.html#feature)

Royal Success

Getting links takes time and patience, and writing is not usually done overnight. To top it off, once these projects are done, you then have to wait for the search engines to discover and re-evaluate your site. Both the link campaign and the writing projects should be something done on a regular basis, not as a one-shot injection. Try to create new pages every month, and to get some links every month. That way your site will make the gradual climb to the top. And who doesn’t want that?

Shawn Campbell

Shawn Campbell is the co-founder and Chief Search Engine Optimizer at Red Carpet Web Promotion, Inc.
www.redcarpetweb.com

The Road to Better Results

A lot has changed in the way sites are optimized for search engines since last year. For one thing, Google is not the only search engine worth looking into anymore; Yahoo has definitely managed to take away some of Google’s oomph over the past twelve months. Another important change is that the intelligence of the search engine spiders and algorithms has increased dramatically. So without further ado, I will present you with a standard search engine optimizing process as done by Red Carpet Web Promotion.

Keyword Research

Nothing can be done until you know what your target phrases are. Keyword research must be done to find out what people are actually typing into the search engines. For example, do they type in “medical insurance” or “health insurance” more often? Is it worth targeting the keyword “dental insurance”? What do your competitors think their clients type?

Keyword research usually begins by asking the client what they think are good keywords and by looking at your competitors’ Meta tags and text. You then have to brainstorm to find new and related keywords that were not previously thought of. The use of Wordtracker, Overture, and Google AdWords’ estimates is indispensable. If you use the “KEI” offered at Wordtracker, don’t fall into the trap of giving it too much weight. It is a good tool for discovering keywords that the competition has not yet exploited, but the really important number is the amount of traffic each keyword generates. Finally, create a chart to map the relationship between the keywords and what your site actually offers. For example, there is no point promoting dental insurance if your site does not offer it.
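For readers curious about the KEI mentioned above: Wordtracker’s Keyword Effectiveness Index is commonly described as search popularity squared divided by the number of competing pages (my paraphrase, not Wordtracker’s official documentation). A sketch with invented numbers shows why raw traffic still matters more:

```python
# KEI as commonly described: (searches)^2 / competing pages.
# Search counts and competition figures below are invented for illustration.
keywords = {
    "health insurance": {"searches": 5000, "competing": 2_000_000},
    "dental insurance": {"searches": 800, "competing": 40_000},
}

kei_scores = {}
for phrase, data in keywords.items():
    kei_scores[phrase] = data["searches"] ** 2 / data["competing"]
    print(f"{phrase}: KEI = {kei_scores[phrase]:.1f}, "
          f"traffic = {data['searches']} searches")
```

In this made-up example, “dental insurance” wins on KEI because it has less competition, but “health insurance” drives over six times the traffic, which is the number that really matters.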

Texts

The next step is to write the text. Hire a specialized writer to put the text together: ideally someone who has been trained in Internet writing, Internet marketing, and search engine optimization (SEO). Failing that, get advice from professional SEOs, marketing experts, and usability experts. Work with the client to get a feel for what the site needs. Then use all these skills to strike the delicate balance between selling to people, selling to search engines, and making the text interesting and useful to read.

Domain Selection

Once the text is written, come up with a catchy domain name for the site. Try to include part of the keyword in the domain, and to think ahead so that the domain can be expanded into the title. Our site www.gloriousbahamas.com is a good example of a domain with a keyword in it that is catchy and clearly stated. The keyword for that site is “Bahamas real estate”, so having part of the keyword in the domain will help in the long run.

Title and Meta Tags

From the domain name, you can then create a title with the full main keyword in it (such as Glorious Bahamas Real Estate). The title is the most important text on the site. The Meta tags include the description tag, and the keyword tag. The description is what the searchers will see in many search engine results, so it must have the keywords in it and, more importantly, it must sell the site. Write a description that is objective, not subjective. Zeal has some good advice for titles and especially description writing at http://zeal.com/guidelines/style/site_titledesc. The keyword tag is done just in case some engines still use it (though very few still do), so don’t pull your hair out over it. Just list 10-15 keyphrases and try not to repeat any single word more than three times.

New Content

Now we come to the meat of today’s search engine optimization. So far, we have not discussed anything new or original; these are the same strategies that have been used since I first got into the business of SEO in 1998. Today, with smarter engines, a site needs to be something that is cared about. A site has to grow, develop, and expand as if it were someone’s baby. Gone are the days when you could build a site, get good listings, and then forget about it as it brought in the traffic and the dough. Take care of your site by adding useful content on a regular basis, and the site will gradually grow from a few pages to dozens of pages. Not only will this make the site seem more alive – radiating with the healthy glow of a developing child – but it has the added benefit of increasing the amount of content the site contains, and thus the number of keywords found within it. For example, with www.canada-health-insurance.com, we add pages with more details about dental coverage, or pages detailing government coverage for each province. Every month there are new pages, so every time the spider comes back to visit, it spends more time at the site reading new content. This is one half of the key to getting good listings in the search engine results pages (SERPs).

Link Campaigns

The second half of the key is getting good sites to link to your site. The priority is websites with related content, sites with good authority in your website’s field, and sites that are “popular”. Collecting reciprocal links is not the goal; getting the aforementioned sites to link to you because you have good, valuable content is. Sites that do reciprocal linking usually have hundreds of links on their link pages, and these add very little value to your site. Don’t waste your time with reciprocal linking. Only link to a site if doing so will increase the value of your site in the eyes of your clients.

A link campaign is a lot of work, and it involves a lot of frustration and rejection. You have to approach bigger sites and sell the value that linking to your site will bring them. For every 20 sites you approach, you will be lucky to get one to link to you. You have to be persistent, consistent, and determined.

Conclusions

Optimizing a site is no longer something you can do and then forget about. For a site to succeed in the search engines today, it has to constantly be changing and growing either in content or in links, and ideally in both. It has to appear that the site is the life and soul of its creator, and that somebody cares enough about it to pay attention to it. Because after all, if the creator doesn’t care, why should the search engines?

Shawn Campbell

Shawn Campbell is the co-founder and Chief Search Engine Optimizer at Red Carpet Web Promotion, Inc.
www.redcarpetweb.com

What is Site Match?

While you may have heard of a controversial new program called Site Match, run by the Yahoo/Overture team, you probably don’t know what this program is really all about. Site Match is a program created to get your site into the Yahoo search database (formerly the Inktomi database), and it can be expensive: it charges a yearly fee plus an additional cost for every click you get from a Yahoo-based search engine.

Demystifying the mysterious Site Match

Let me back up a bit and give you some history. On a hot July day in 2003, the directory giant Yahoo bought the colossal Pay Per Click (PPC) database Overture (previously known as GoTo). Today, Yahoo has decided to monetize that purchase by offering us programs such as Site Match.

What Site Match is not

  • Site Match has nothing to do with the $299 fee you pay to get into Yahoo’s directory
  • Site Match will not get you better rankings in Yahoo (or in Overture)
  • Site Match does not get you into Overture’s auction-style PPC database

What Site Match is

Site Match ensures that your site is listed in Yahoo’s search database (not their directory), and that it is refreshed every 48 hours. If you are not listed in Yahoo’s search database you can do one of two things:

  1. Wait until Yahoo’s spider picks you up
  2. Pay Site Match to list you right away.

Unless your site is brand-spanking new, you are likely to already be listed in Yahoo’s search database. Yahoo’s spider (Slurp) does an extensive job of picking up websites to add to the database. To verify that you are indeed listed, you can type your domain into the Yahoo search box and see if your site comes up (type “yourdomain.com” without the quotes). If a result comes up, then you are in the database.

How much does Site Match cost?

Site Match costs $49 per year to sign up, plus $0.15 to $0.30 per click afterwards. If you are already listed in Yahoo’s database, you get the exact same service for free (except that your site is refreshed every month instead of every 48 hours). What are the advantages of being refreshed? What does “refreshed” mean? It means that Slurp visits your site and updates its listing in the Yahoo database every two days.
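To put those figures in perspective, here is a back-of-the-envelope cost estimate; the monthly click count is an assumption for illustration:

```python
# Rough Site Match cost estimate.
# The annual fee and per-click range come from the article;
# the monthly click count is a hypothetical example.
annual_fee = 49.00
cost_per_click_low, cost_per_click_high = 0.15, 0.30
clicks_per_month = 500  # assumed traffic from Yahoo-based engines

yearly_clicks = clicks_per_month * 12
low_estimate = annual_fee + yearly_clicks * cost_per_click_low
high_estimate = annual_fee + yearly_clicks * cost_per_click_high

print(f"Estimated yearly cost: ${low_estimate:.2f} to ${high_estimate:.2f}")
```

Even at modest traffic levels, the per-click charges quickly dwarf the $49 signup fee, which is worth remembering when weighing the program against the free inclusion most sites already get.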

My Site Match test

I decided to test out a site to see if there are any benefits to using Site Match. I submitted www.PrintPot.com to the program on April 23rd. This site was created in early April, so it just got into the database a week before. The rankings before and after Site Match are as follows:

 

Keyword                           Ranking April 23rd     Ranking May 3rd
                                  (Before Site Match)    (After Site Match)
print pot                         9                      14
epson inkjet refill kits          59                     none
epson ink refill kits             65                     none
epson refill kits                 126                    none
compatible epson ink cartridges   164                    none
epson chip resetter               355                    none

A ranking of “none” means that the site did not turn up in the listings. As you can see, after we signed up for Site Match, our rankings dropped dramatically. Yahoo only shows the top 500-700 listings, and The Print Pot (which sells Epson inkjet refill kits) is not found at all.

This was my experience with Site Match, but it was only one experience. I doubt that I will be using Site Match again, nor would I recommend it to my clients. However, with only one test, it would be a mistake to conclude that the majority of sites will drop in listings after signing up for Site Match. What is troubling is how Site Match affected all the keywords that Print Pot was struggling to improve.

What happened? Was there a penalty? The site has no reason to be penalized as it followed all the content guidelines listed by Yahoo/Overture at http://www.content.overture.com/d/USm/ays/sm_gl.jhtml. These guidelines were pointed out by the support staff from PositionTech, a reseller of Site Match. So why did it drop? I suspect that Yahoo is still ironing out the bugs from its Site Match program. The other possibility is that Yahoo changed its algorithm, and the new one does not rank www.PrintPot.com highly. I have sent a letter to Yahoo and expect an answer in the near future. I will inform you of their response in the next www.RedCarpetWeb.com newsletter.

Should you pay for Site Match?

The answer is wonderfully complicated.

Option 1) If you are already in the database: The answer is a big fat NO WAY. The only exception would be if you change your site more often than once a week. Even then, Yahoo would only update the description, title, and ranking of your site on the search engine results pages. The link would still go to the new updated site even if you don’t pay, and even if it is not freshly spidered.

Option 2) If you are not in the database because your site is new: It would normally take 2-5 weeks to be included in Yahoo’s database for free. If you are in a hurry, then sign up to Site Match. You should show up within 48 hours, and you will be paying $0.15 to $0.30 per hit.

Option 3) If you are not in the database but your site has been online for over 2 months: Something is wrong with your site. Either your robots tag is wrong, you have zero links pointing to your site, or you have a penalty of some kind. To resolve the situation, hire a search engine optimization specialist to inspect your site.

Conclusions

Unless you are running a site that gets updated on a near daily basis (such as a news site or a web log), my advice is not to sign up for Site Match. You would be throwing your money at Yahoo, and in return you would be getting a service you probably don’t really need. If you are not listed in their database, there is probably a reason for it, and that same reason would probably keep you out of the database even if you signed up to Site Match.

Shawn Campbell

Shawn Campbell is the co-founder and Chief Search Engine Optimizer at Red Carpet Web Promotion, Inc.
www.redcarpetweb.com

Yahoo’s Back!

I was all set to write an article predicting the future of search engines when Yahoo dropped Google and replaced it with its own engine. Now that’s big news. In less than twenty-four hours, Google went from about 79% of the search market share to about 51%. And what a welcome relief it is, too! Being #1 in Google was great, but when you had the misfortune of dropping even a couple of positions, you really felt it. Now there will be more stability: if you drop in Google today, your hits from Yahoo will remain consistent.

What is the new Yahoo?

Last year, Yahoo bought the AltaVista, Fast, and Inktomi search engines. The new Yahoo results come from none of these. Many people are saying that the results come from a new Inktomi because the results are similar; but the results are also similar to those of all the other search engines out there. In comparing these engines, it seems to me that Yahoo’s results come from a brand new engine. Maybe they took parts and ideas from all the search engines they bought – maybe they even took the best parts – but whatever they did, the result is something completely new.

Which search engine is better?

I will be comparing Google and Yahoo for the terms “music”, “art prints and posters”, “Bahamas real estate”, “mosquito nets”, and “liposuction”. The other search engines all hold less than 4% of the market share (except for MSN which uses Yahoo’s Inktomi), so I won’t be considering them. Here is what I found in the top 10 results for each keyphrase:

Music

Yahoo offers a lot of music resource sites. Information about music from different sources, such as magazines, TV, and other music news sites, appears 6 times in the top 10 results. Yahoo also offered downloading and file-sharing programs 3 times. The 10th result was an audio player program site.

Google has a lot more diversity. There were 3 music resource sites (but no magazines), one downloading program, one CD store, one radio station (Yahoo radio), and one music directory, and the 10th result was an audio player program site. Google also had 2 sites in its top ten that were of no value whatsoever: MP3.com, which has just one page stating that it no longer offers the services it used to (with links to its parent company), and music.com, which is nothing more than an email-gathering page for a newsletter (not a single link on the entire page).

Google’s diversity is a big bonus, but the 2 spam/junk/useless sites really hurt it. The results? Yahoo 1, Google 0.

Art Prints And Posters

Yahoo offers 6 stores, while Google offers 5. The other links are all affiliate spam with no content whatsoever (just links to stores), with the exception of one of Yahoo’s links, which has some biographical content about artists. So Google has 5 spam sites, and Yahoo has 3 and a half.

Yet another round goes to Yahoo.

Bahamas Real Estate

For this keyphrase, I found the results from Google and Yahoo to be quite similar. The only differences were in the mix of sites by actual realtors versus sites that were simply property listings. Both types of results are useful, with Google having an edge in realtors. Google had some lower-quality sites, but the information was just as good even if they seemed less professional. On the other hand, Yahoo did have one site that was nothing more than a links page from another realtor’s site. Big boo boo.

This one goes to Google.

Mosquito Nets

It seems to me that someone searching for “mosquito nets” wants either A) to buy them, or B) to learn about them, so I was expecting to find either stores or information about mosquito nets. Yahoo showed me 6 stores and 2 informational pages. The other results were a search result page (not a good result) and an inner page from a previous result (also not a good result).

Google gave me 7 stores and 3 charitable organizations (one of which was a store as well). The results from the other 2 charitable organizations were a news article outlining their work with mosquito nets, and general information about mosquito nets.

So even though I didn’t necessarily want the latest news about what a charity did regarding mosquito nets, I think getting the same site twice from Yahoo (not to mention the search result page) is the bigger no-no. Google wins this round.

Liposuction

I expected to find information about liposuction, liposuction organizations and either doctors or centers where you can have liposuction done. What I got was a lot of “how to find a doctor” sites, with a lot of good information.

Yahoo’s results included 3 sites listed twice. This is the kind of problem that killed AltaVista in the late ’90s; hopefully they will have it fixed soon. Other than the 3 doubled sites, the results included 4 informational sites, 2 sites for finding doctors, and one poorly written article about the history of liposuction.

Google gave me 4 good informational sites, 2 good “find a doctor” sites, one recent article about liposuction aimed at people in the industry, one site with very poor information written by a single doctor, and one site that was nothing more than a directory.

Google gets this round as well. Overall, it looks like Yahoo needs to fix its doubling of sites and Google needs to clean out some spam (poor sites).

And The Winner Is…

You! Having two good search engines to choose from makes searching that much better for everyone. It also makes getting listings better. It also makes marketing better. It also makes traffic to your site steadier. The only way this could have been worse is if Yahoo’s results sucked, and they don’t. They seem just as good, if not better, than Google’s.

So rejoice, and enjoy a more dynamic world of online searching!

Shawn Campbell

Shawn Campbell is the co-founder and Chief Search Engine Optimizer at Red Carpet Web Promotion, Inc.
www.redcarpetweb.com

Google’s Florida Update

On November 16th, Google did a major overhaul of its results. Many legitimate sites that were basking in the sun at the top of Google results plummeted down the rankings into a dark abyss of countless other results. Many webmasters and search engine optimizers went into shock, as they saw their Christmas sales sink like a stone in the water. Many people got angry. Many people wrote letters and participated in forums. Many people complained, and many people wept. Basically, anyone who wasn’t a multinational conglomerate or an educational institution felt the pangs of rejection. Google shook the Internet… again.

This Google update has been nicknamed the Florida update in honour of the Florida election fiasco in 2000.

What changed?

On average, about 50% of all results in the top 100 have now fallen below the top 500. These changes only apply to certain terms, usually the terms that are related to commercial searches. You can see if your site has fallen at www.google-watch.org/scraper.html by typing in your keyword and looking for your site. This site compares the old Google results with the new results and counts how many sites are missing from the new top 100. Google is currently trying to block these results by blocking Google-watch.org’s IP address, but as of the writing of this newsletter the site is still in operation.
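The comparison that scraper performs boils down to diffing two ranked lists; a minimal sketch with placeholder domains:

```python
# Compare an old top-100 result list with a new one and count how many
# sites vanished -- the same idea the scraper tool applies to Google's
# pre- and post-Florida results. The domains here are placeholders.
old_top100 = ["siteA.com", "siteB.com", "siteC.com", "siteD.com"]
new_top100 = ["siteB.com", "siteE.com", "siteF.com", "siteD.com"]

missing = [site for site in old_top100 if site not in new_top100]
print(f"{len(missing)} of {len(old_top100)} old results are gone: {missing}")
```

Run against real top-100 lists, a figure of around 50% missing is what the article describes for commercial search terms.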

The Theories

There are many theories about what Google did and why. One of them is that Google removed commercial sites from its free listings in order to get the merchants to buy more AdWords. I disagree with this theory. I don’t believe they did it to monetize. I think that Google’s intention was to diversify the types of sites in the top results in order to provide better results to users. I think the profit they will make from this change just happens to be a happy side effect for Google. One thing is for sure: there are far fewer commercial listings in the top results than ever before. Many have been replaced by educational, governmental, or directory listings. The top ten in many results are now a mix of informational, authoritative commercial, and directory listings. This gives searchers more of a choice in the kind of site they want. The commercial sites that survived tend to be the leaders in the industry.

Here are some of the theories about the changes at Google:
www.webworkshop.net/florida-update.html
searchenginewatch.com/searchday/article.php/3286101
www.searchengineguide.com/hotchkiss/2003/1215_gh1.html
www.searchengineguide.com/terry/2003/1205_tv1.html
www.webpronews.com/wpn-4-20040108GoogleRumorsThatNeedToBeStopped.html

What to do

Since the big update, many sites that were dropped have been crawling their way back into the top results. It seems to be a slow and painful process. Google’s Senior Research Scientist Craig Nevill-Manning actually apologized for the update, saying: “I apologize for the roller coaster. We’re aware that changes in the algorithm affect people’s livelihoods. We don’t make changes lightly.” The good news is that if a site has a lot of good content, then Google seems to care. The more content you have, the better Google seems to like you.

Many search engine optimizers are frantically making changes, but until we understand more about what motivated the update, such a reaction becomes the equivalent of thrashing and flailing about in the water. Right now the best thing to do is to continue adding good quality content to your site, make sure you are not using spammy techniques, and continue getting links from quality sites and directories.

Hold the boat steady, and you will weather the storm.

Shawn Campbell

Shawn Campbell is the co-founder and Chief Search Engine Optimizer at Red Carpet Web Promotion, Inc.
www.redcarpetweb.com

The Future of Inktomi

In December of 2002, Yahoo! bought Inktomi for $235 million.

As things currently stand, Yahoo, MSN, and Google each receive about 30% of online search traffic. However, because Google’s search results are used for Yahoo! searches, Google actually captures about 60% of the market. Since the sale of Inktomi, we search engine optimizers have been collectively holding our breath waiting for Yahoo! to dump Google and get in bed with Inktomi for Yahoo! search. That would decrease Google’s stranglehold on search traffic and create more competition.

Where does Inktomi get its traffic?

Inktomi provides results for MSN search. Currently, when someone does a search on MSN, they get the following results:

– MSN’s own “featured sites” results (paid advertisers) (usually 1-5 results)
– Overture’s “sponsored sites” (pay-per-click advertisers) (usually 3 results)
– LookSmart’s “web directory sites” (anywhere from 0 to 50 results)
– Inktomi’s “web pages” (the leftovers)

Inktomi is also the back-up for Overture results (after Overture’s own sponsored links), and is used for a myriad of smaller web portals and search engines, such as About, BBC, Espotting, Goo, HotBot, and InfoSpace.

The good news for Inktomi is that MSN will be dropping LookSmart in mid-January of 2004, and replacing its results with Inktomi’s. Add that to the Yahoo! swap (Inktomi for Google), and Inktomi will go from having a very small share of search traffic to having almost 60% of the total.

Inktomi’s Quality

Obviously, Inktomi is about to become very important, so I looked into Inktomi’s search results in the hope of figuring out how to optimize for this born-again search company. The results, however, were not very encouraging. There was a lot of doubling (the same site showing up 2 or 3 times) and a lot of spam. Mind you, there is a lot of spam in Google too. (Spam is any web page that attempts to deceive a search engine’s relevancy algorithm, usually resulting in pages that are irrelevant to a user’s search.)

The interesting thing was that the results differed a great deal from Google’s. This is good for the consumer (different results means real competition), but bad for the search engine optimizer. Ideally, a well-optimized site should show up in all the search engines, but Google and Inktomi have different ideas about relevancy. This means we will be seeing a lot of sites built twice: once for Google, and once for Inktomi. And more sites mean more spam.

Hopefully, in the end, the high-quality sites will come out on top in both engines. I imagine both Google and Inktomi will show either similar relevant sites, or different, but still relevant, sites.

Shawn Campbell

Shawn Campbell is the co-founder and Chief Search Engine Optimizer at Red Carpet Web Promotion, Inc.
www.redcarpetweb.com

Tips for pay-per-click bidding

As you may know, Overture was recently bought by Yahoo! Due to the publicity generated by the deal, now is a good time to review some tips for bidding in pay-per-click (PPC) campaigns on Overture and Google AdWords.

How does a pay-per-click search engine work?

With a PPC search engine, you bid a certain amount for your chosen keyphrase. Whenever someone searches for that keyphrase, and clicks on your website’s link under it in the search result, you pay the bid amount (or less, depending on your competition). Generally, the higher you bid, the higher your placement is in the search results.
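A minimal sketch of this auction model, with made-up advertisers and bids:

```python
# Strict-auction PPC model (simplified): advertisers are ranked by bid
# alone, and each click costs the advertiser their bid amount.
# All advertisers and bid amounts here are invented.
bids = {
    "widgets-r-us.com": 0.45,
    "acme-widgets.com": 0.60,
    "widget-world.com": 0.30,
}

# Highest bid wins the top position.
ranking = sorted(bids, key=bids.get, reverse=True)
print("Positions:", ranking)

# Cost of 100 clicks for the top bidder.
cost = 100 * bids["acme-widgets.com"]
print(f"100 clicks at the top spot cost ${cost:.2f}")
```

This is the Overture-style model described below; Google AdWords complicates it by factoring in click-through rate.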

Overture

Overture was the biggest PPC search engine before Google AdWords came along. Now they are both fighting for the top spot, leaving the rest of the pack in their wake. Overture works on a strict auction model: the higher you bid, the higher your position is. Overture’s results are included in the sponsored results at the top of Yahoo!, MSN, Lycos, HotBot, and others. They claim to reach over 80% of all Internet users.

Google AdWords

Google’s AdWords program started in February 2002, and quickly became Overture’s only serious competitor. Google’s system is different from Overture’s in that the bidding is only one part of the ranking equation. The other part is the click-through rate (how often people click on your ad). Google AdWords are found on AOL, Netscape, Ask Jeeves, Teoma, Earthlink, and, of course, Google. Google states that their AdWords appear 200 million times a day. I figure that they also reach about 70% of all Internet users.
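Google’s two-part equation can be sketched as bid multiplied by click-through rate. This is a simplification of whatever Google actually computes, and the advertisers and numbers are invented:

```python
# AdWords-style ranking (simplified): position depends on bid AND
# click-through rate, so a cheaper ad with a better CTR can outrank
# a higher bid. All figures below are hypothetical.
ads = {
    "acme-widgets.com": {"bid": 0.60, "ctr": 0.02},
    "widgets-r-us.com": {"bid": 0.40, "ctr": 0.05},
}

def ad_rank(ad):
    # Effective rank score: what the advertiser pays per click,
    # weighted by how often people actually click the ad.
    return ad["bid"] * ad["ctr"]

ranking = sorted(ads, key=lambda name: ad_rank(ads[name]), reverse=True)
print("Positions:", ranking)
```

Here the lower bidder wins the top spot because its ad gets clicked far more often, which is exactly why a well-written ad matters as much as a big budget on Google.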

Other PPC search engines:

There are hundreds of PPC search engines out there, but you really only need to advertise with the top two. If you want to see some lists, you can go to www.PayPerClickSearchEngines.com.

Pay-per-click tips

Here are some tips for running a PPC campaign:

  1. The #1 rank is not always the best. In fact, you can usually get a better return on investment (ROI) by being the second or third result in a search. People often click on the first result without thinking; when they realize the site does not offer what they want, they come back and think (and read the description) before clicking on the second or third result. I mention only the second and third results because usually only the top three results get published (on Yahoo, MSN, and many others). Often the second and third results are much less expensive than the number one spot.
  2. Bid on as many relevant, highly specific, low-cost keyphrases as you can afford. A keyphrase with only one or two keywords will usually cost much more than one with three or four words. Longer keyphrases also tend to be more targeted (for example, shiny blue widgets instead of just plain old widgets). Thus, with longer keyphrases, you get lower costs and a higher return on investment. If you bid on enough of these targeted keyphrases, you can usually generate enough traffic to match what you would receive for a single-word keyphrase. To summarize, bidding on shiny blue widgets, pre-owned utility widgets, and zebra-striped widgets will cumulatively generate the same amount of traffic as just bidding on widgets, but with a higher ROI because they cost less per click.
  3. Include your keyphrases in your title and descriptions. Think hard about your description because generally, the best description gets the most traffic (not always the highest ranking result).
  4. Use objective, not subjective language in your descriptions. Subjective descriptions will state how great the website is. Objective descriptions are ones that list the benefits of a website, or mention what the surfer can expect to find. Try to point out what is unique about your website.
  5. Create highly relevant landing pages for your PPC campaign. These landing pages (where the PPC link points) are what convert a surfer into a buyer. You have already paid for the surfer to see this page, so use your resources to make it a good conversion page. Also, keep separate track of buyers who arrive via your PPC campaign and buyers who arrive via other means. That way, you can track your ROI and figure out how much you should spend on the PPC engines.
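The arithmetic behind tip #2 can be sketched out in a few lines. All of the numbers below (clicks, costs per click, conversion rates, profit per sale) are invented for illustration, not real market prices:

```python
# Hypothetical ROI comparison for the long-tail strategy in tip #2:
# several cheap, specific keyphrases versus one expensive broad term.

def roi(clicks, cost_per_click, conversion_rate, profit_per_sale):
    """Return (total cost, revenue, ROI) for one keyphrase."""
    cost = clicks * cost_per_click
    revenue = clicks * conversion_rate * profit_per_sale
    return cost, revenue, (revenue - cost) / cost

# One broad, expensive keyphrase ("widgets")...
broad = roi(clicks=300, cost_per_click=1.00,
            conversion_rate=0.02, profit_per_sale=50)

# ...versus three specific keyphrases that together match its traffic,
# at a lower cost per click and (being more targeted) a better
# conversion rate.
specific = [roi(clicks=100, cost_per_click=0.25,
                conversion_rate=0.04, profit_per_sale=50)
            for _ in range(3)]

total_cost = sum(cost for cost, revenue, r in specific)
total_rev = sum(revenue for cost, revenue, r in specific)
long_tail_roi = (total_rev - total_cost) / total_cost

print("broad term ROI: %.0f%%" % (broad[2] * 100))       # 0%
print("long tail ROI:  %.0f%%" % (long_tail_roi * 100))  # 700%
```

Same 300 clicks either way, but the targeted keyphrases cost a quarter as much and convert twice as often, so the return is dramatically better. The exact figures will vary; the point is that the cheap, specific phrases compound in your favor.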


Shawn Campbell

Shawn Campbell is the co-founder and Chief Search Engine Optimizer at Red Carpet Web Promotion, Inc.
www.redcarpetweb.com

Google Shakes Its Bootie

Google has made some major changes in the last two months. Normally, there is what we call a “Google Dance” every month. This lasts about a week, and it occurs as Google integrates new sites into its database. Google also tweaks its algorithms during this period, so most sites move up and down in the search results.

A side effect of this dance is the spread of the Google Dance Syndrome — a disorder that causes search engine optimizers and webmasters to pull out their hair and chew their fingernails while waiting for this algorithmic game of musical chairs to stop, so that they can see where their site ended up. Search Engine Watch has more details about the syndrome.

The Big Shake

This May, Google did a major overhaul of its database, which contains over 3 billion web pages. They dropped a lot of sites, and they played with a lot of spam triggers. A spam trigger is a part of the algorithm that triggers a spam penalty for a site. For example, they have made the trigger for hidden text more sensitive, so that if you have a small amount of hidden text on your site, you will get penalized.

These spam triggers have caused a lot of headaches in the SEO world, but Google seems to be correcting any over-sensitive triggers that they create (although it will take two months or more of lost traffic before the correction is implemented).

Don’t Panic!

The bottom line is that if your site has been penalized, or even removed from the database, don’t panic. Just send a nice letter to webmaster@google.com explaining why you think you were penalized, and what you have done to amend the situation. I have found that you get a reply about 50% of the time, and that it usually works about 50% of the time (sometimes you will get a reply that is unhelpful, and sometimes they will correct the penalty without a reply).

What To Do

Here is something for you to do while you are twiddling your thumbs: try submitting to other search engines. Google, after all, is not the only kid on the block. True, Google-powered results (on Google itself, Yahoo!, and AOL) represent 80.7% of the market, but MSN still has 9.6%. So make sure you are in MSN by submitting to Zeal and/or Inktomi. You may also wish to spend time submitting to Dmoz.org (the Open Directory Project) and the Yahoo! directory, and exploring any industry directories or associations that would consider listing your site. Lastly, consider starting a pay-per-click campaign with Google or Overture. In my research, I have found that Google's AdWords have a higher rate of return than Overture, so I would recommend advertising with Google.

Global average usage share

Google 55.2%
Yahoo! 21.7%
MSN Search 9.6%
AOL Search 3.8%
Terra Lycos 2.6%
Altavista 2.2%
Askjeeves 1.5%

All numbers are an average of the last two months, from OneStat.com.

One final note… There is a lot of speculation that Yahoo! will use Inktomi, or a combination of Inktomi and Google, in the near future. That would mean Inktomi taking the lion's share of Yahoo!'s 21.7% market share away from Google. So make sure you are listed in Inktomi, so that you don't have to scramble when the changeover occurs.

Next issue, I will update you on other changes in the industry.

Shawn Campbell


Search Engines Buying Search Engines

Headlines

  • Yahoo! Buys Inktomi
  • Overture Buys Altavista.com
  • Google Buys Blogger.com
  • Overture Buys AllTheWeb

What is going on?

Well, there has always been a little incest between the search engines. MSN currently uses LookSmart and Inktomi. Yahoo! and AOL both use Google and, of course, so does Google. So the current market has Google providing results for three of the big four search sites. This creates a hefty imbalance, and the industry has been holding its breath waiting for the hatchet to come down.

Yahoo buys Inktomi

It looks like Yahoo has made the first step. By buying Inktomi, Yahoo could potentially use results that they own, instead of paying for them “per click” from Google. Another scenario would have Yahoo combining Google and Inktomi results with its own directory results, making it into a meta search engine. Yahoo’s purchase of Inktomi also gives Yahoo a great paid inclusion program. Thus, not only does the purchase have the potential to make Yahoo into a better search site, but it will also make them money immediately.

Overture buys Altavista and AllTheWeb

Overture buying up Altavista and AllTheWeb is not so clear cut. Overture is a pay-per-click search engine that makes the bulk of its money by posting its results (sponsored sites) on Yahoo and MSN. It currently uses Inktomi to “fill up” any results where there are no bidders. Of course, on Yahoo and MSN the Inktomi results never show up (because they don’t pay).

So what will Overture do with its two new acquisitions? Well, Google is Overture's only real rival in the pay-per-click arena. Many people think that Overture bought AllTheWeb to compete with Google in the search engine spidering arena. Another reason for the acquisition is that Overture can cash in on Altavista's and AllTheWeb's paid inclusion programs. Overture will probably continue to focus on pay-per-click, while using the eyeballs at Altavista and the technology at AllTheWeb to improve their own services. They will almost certainly replace the Inktomi results (which they do not own) with results from one of their new purchases.

What about MSN?

MSN hasn’t bought any big search engine recently. What will they do in the future? Many people are speculating that MSN will drop Inktomi now that their main rival Yahoo! owns it. This is probable in the long term, but I don’t think they will make any bold moves soon. MSN recently pointed out that Google is definitely a rival, so it looks like that is where their crosshairs are aimed. Another possibility is that they become better buddies with Overture, using Altavista’s or even AllTheWeb’s search results instead of Inktomi’s. They could also buy up Wisenut, another spidering search engine, which is owned by LookSmart.

Only time will tell

One thing that I think is for sure (if anything is in this industry) is that a good ranking in Inktomi’s listings is going to be a lot more valuable once Yahoo! incorporates it, so get your site tweaked now. A site that has been optimized for Inktomi will do better in the search engine results.

Next issue, I will update you on other changes in the industry.

Shawn Campbell


Picking Apart PageRank

Google is currently the darling of web surfers. With robust algorithms such as PageRank, Google helps users find relevant results, quickly. But while PageRank may be a boon for searchers, it is also the bane of webmasters because it is one of the most difficult ranking factors to control.

PageRank is the brainchild of Google co-founders Sergey Brin and Larry Page. It is a system for ranking web pages that is based on an assumption popular among academics: that the importance of a research paper can be judged by the number of citations it has from other research papers.

The pair simply came up with the web page equivalent: the importance of a web page can be judged by the number of links it has from other web pages.

To find out what a website’s PageRank is, you’ll need to install the Google Toolbar for Internet Explorer.

The Google toolbar sits underneath your address bar and displays a bar graph representing the PageRank of the page you are viewing.

[Screenshot: the Google Toolbar with its PageRank bar graph]

How it All Works

When a user visits Google and enters a query, several things happen. First, Google finds all the web pages in its index that match the search term. Next, out of these results, Google selects a subset of web pages that have the greatest relevance to the query.

At this point, PageRank is not a factor at all. Google first looks at all the usual factors such as keyword density and prominence to calculate relevance. PageRank only comes into play as a multiplier after all these other factors have been calculated. In other words:

Final Ranking = (score for all other relevance factors) x (PageRank rating).
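This two-stage process can be sketched as a toy calculation. The page names, relevance scores, and PageRank ratings below are all invented for illustration; the point is only that PageRank acts as a multiplier after relevance is scored:

```python
# A toy sketch of the two-stage ranking described above: relevance is
# scored first, and PageRank comes in only as a final multiplier.

pages = {
    # page: (relevance score, PageRank rating) -- invented values
    "widgets-guide.html": (8.0, 4),
    "widgets-shop.html":  (9.0, 2),
    "widgets-blog.html":  (5.0, 6),
}

def final_ranking(relevance, pagerank):
    # Final Ranking = (score for all other relevance factors) x (PageRank)
    return relevance * pagerank

results = sorted(pages, key=lambda p: final_ranking(*pages[p]),
                 reverse=True)
print(results)
# ['widgets-guide.html', 'widgets-blog.html', 'widgets-shop.html']
```

Notice that the most relevant page (widgets-shop, relevance 9.0) ends up last: its low PageRank drags its final score down, while a less relevant page with strong PageRank climbs above it.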

To determine a page’s PageRank, Google looks at a web page and counts how many incoming links are pointing to it. Google regards these links as “votes”. If one site links to another site, it is essentially casting a vote for that site.

Google doesn’t just count the total number of “votes” or links that a web page receives to determine its PageRank, however; it also analyzes the web page that casts each vote.

Votes cast by pages that Google deems “important”, i.e., sites that already have a high PageRank, are given more weight and help to increase the PageRank of the web pages they link to.

The actual PageRank of a web page is calculated as the sum of the PageRank of all the web pages linking to it, divided by the number of outgoing links on each of those pages.
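The calculation just described can be run iteratively over a link graph until the values settle. The three-page graph below is invented for illustration, and note that Google's published formula also includes a damping factor, which this simplified sketch omits:

```python
# A minimal iterative sketch of the calculation described above: each
# page's PageRank is the sum, over the pages linking to it, of that
# page's PageRank divided by its number of outgoing links.

links = {            # page -> pages it links to (invented graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

pr = {page: 1.0 for page in links}      # start every page equal

for _ in range(50):                     # iterate until values settle
    new_pr = {page: 0.0 for page in links}
    for page, outgoing in links.items():
        share = pr[page] / len(outgoing)  # each link passes an equal share
        for target in outgoing:
            new_pr[target] += share
    pr = new_pr

print({page: round(rank, 2) for page, rank in pr.items()})
# {'A': 1.2, 'B': 0.6, 'C': 1.2}
```

Page B ends up with the lowest rank even though an "important" page (A) votes for it, because A splits its vote between two outgoing links — exactly the division-among-outgoing-links effect discussed below.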

Improving Your PageRank

Improving your website’s PageRank may sound easy: just find sites with a high PageRank to link to your site. In reality however, it’s not that simple.

Many webmasters with high-PageRank sites will not link to a site with a lower PageRank; it simply isn’t worth their while to do so. Moreover, even if they do link to your page, if they also link to numerous other pages, the PageRank they pass on is divided among all the outgoing links.

Consequently, it may actually be beneficial to propose link exchanges with quality sites with a slightly lower PageRank: competition for links from such sites is less fierce and webmasters may be more willing to reciprocate links.


The Trouble with PageRank

While the premise behind PageRank may hold true within the halls of academia, when applied to web pages, its flaws start to show.

Although it would seem like common sense that a website would only link to another site if it had good content, in reality, websites link to sites with poor content all the time. Webmasters may engage in purely commercial link exchanges, or they may link to a page because they use that website’s counters or banner ads on their own website.

Moreover, affiliate websites that generate revenue through pay-per-click links may artificially inflate their client’s PageRank, thus undermining any notion of a natural PageRank.

New sites are often the worst affected by PageRank. Regardless of their quality, new sites will always have fewer incoming links and, therefore, a lower PageRank. Consequently, getting sites with a higher PageRank to link to them will be difficult.

Websites with a good PageRank, however, have no trouble soliciting links. Because of their good PageRank, they tend to rank highly in the search engine results pages; and because they rank highly, people tend to link to them, creating a self-reinforcing cycle.

Final Thoughts

While PageRank is one of the hardest factors to influence, it can still be manipulated. As more and more people discover these strategies, the utility of PageRank will undoubtedly be diminished.

-Julie Joseph

Julie Joseph is a search engine optimizer and copywriter at Red Carpet Web Promotion, Inc.