19 common mistakes that prevent your Web site from showing up on search engines

Many webmasters find that their Web site is not listed in search engines at all. There can be a variety of reasons why your Web site doesn't show up on search engines; reasons #7 to #12 are covered below.

Axandra.com 



Reason #7: Your Web pages are created dynamically.

Databases and dynamically generated Web pages are great tools for managing the content of big Web sites. Imagine having to manage the content of the New York Times Web site without a database...

Unfortunately, dynamically generated Web pages can be a nightmare for search engine spiders, because the pages don't actually exist until they are requested. A search engine spider cannot fill in a form or select the variables that are needed to generate a page.

The exceptions are the spider programs from Google and Inktomi. They are able to index Web pages that are dynamically generated, even those that use question marks and query strings.

On the other hand, AltaVista isn't able to index dynamically generated Web pages, and here's why:

http://help.altavista.com/adv_search/ast_haw_wellindexed

If you create dynamic Web pages with the help of Active Server Pages (ASP), ColdFusion, CGI, Perl or the Apache Server, then the following Web page offers good advice:

http://spider-food.net/dynamic-page-optimization-b.html
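
One common workaround, which the text above does not describe in detail, is to put parameters into the URL path instead of the query string, so that spiders which ignore "?" URLs can still reach every page. The following is a minimal, hypothetical sketch of a Python CGI script that reads such path-style URLs; the script name, categories and item numbers are made-up placeholders:

#!/usr/bin/env python3
# Hypothetical sketch: serve /cgi-bin/catalog.py/shoes/42 instead of
# /cgi-bin/catalog.py?cat=shoes&id=42, so the dynamic page is reachable
# through a plain, query-string-free URL.
import os

def main():
    # PATH_INFO holds everything after the script name, e.g. "/shoes/42"
    path = os.environ.get("PATH_INFO", "").strip("/")
    parts = path.split("/") if path else []
    category = parts[0] if len(parts) > 0 else "home"
    item_id = parts[1] if len(parts) > 1 else None

    # A real script would look the category and item up in the database here.
    print("Content-Type: text/html")
    print()
    print("<html><body>")
    print("<h1>Category: %s</h1>" % category)
    if item_id is not None:
        print("<p>Item number: %s</p>" % item_id)
    print("</body></html>")

if __name__ == "__main__":
    main()

Static pages can then link to /cgi-bin/catalog.py/shoes/42, giving spiders a plain path to follow instead of a query string.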

Reason #8: You have moved your site to a new server.

Every time you move a Web site to a new Web hosting company, or when you change the domain name, the IP/DNS address of your domain name changes. While people see URLs (www.yourdomain.com), search engines often only see IP addresses.

This means that you need to re-submit your Web site to all search engines and directories when you move your Web site to a new Web hosting company or when you change your domain name. In time, search engines will realize that your IP/DNS address has changed and will remove the old IP address from their index.

It's important that you re-submit only when the move is completely finished. A good way to tell when the DNS change has propagated is to upload a slightly different version of the index page to the new server. When you enter www.yourdomain.com in your Web browser and see the different version, you'll know that the DNS servers of your ISP (Internet Service Provider) have been updated.
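
If you prefer to check this from a script, here is a minimal sketch (not part of the original article) that compares what your domain name currently resolves to with the IP address of the new server. The domain name and IP address below are placeholders, and the answer depends on which DNS server the machine running the script uses:

import socket

DOMAIN = "www.yourdomain.com"          # placeholder: your own domain
EXPECTED_NEW_IP = "203.0.113.42"       # placeholder: the new server's IP address

try:
    resolved_ip = socket.gethostbyname(DOMAIN)
except socket.gaierror as error:
    print("DNS lookup failed: %s" % error)
else:
    if resolved_ip == EXPECTED_NEW_IP:
        print("%s already resolves to the new server (%s)." % (DOMAIN, resolved_ip))
    else:
        print("%s still resolves to %s; the DNS change has not reached "
              "this DNS server yet." % (DOMAIN, resolved_ip))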

If you have changed your server, resubmit your Web site URL to search engines. If you have changed your domain name, you should also change your Yahoo and Open Directory Project listings.

If you want to change your Yahoo listing, go to:

http://add.yahoo.com/fast/change

If you want to change your Open Directory Project listing:

  1. Go to "http://dmoz.org/".
  2. Locate the category in which the site appears (search for your domain name) and go to that category.
  3. Click the Update URL link at the top of the Web page.
  4. Enter the URL of the Web site you would like to update.
  5. A new page loads where you can make changes.
  6. Click the Update URL button once you've finished with the changes.

If you change your domain name, it's likely that you will lose some of your old visitors. Here's advice on how to reduce the loss of visitors:

http://www.thesitewizard.com/archive/movinghosts.shtml

Reason #9: Your Web page does not have a unique IP address.

Does your Web site have a unique IP address? If not, your Web site runs the risk of getting banned from the search engines.

Human beings use domain names like yahoo.com, but network computers use IP addresses, which are numeric addresses written as four numbers, separated by periods.

Every domain name translates to a so-called IP address. For example, yahoo.com is translated to "64.58.76.225". Just enter "http://64.58.76.225/" in your Web browser and you'll go to www.yahoo.com.

Many Web hosting services don't give out unique IP addresses to their customers to save money. They assign the same IP address to multiple domain names. This means that several hundred Web sites could all be using the same IP address as your site does.

There are 3 reasons why you need a unique IP address:

  1. If you're sharing an IP address with 50 other sites, you're trusting them not to over-submit or spam the search engines. When a search engine blocks an IP address, all the sites that are sharing that IP address are blocked. You could wind up being banned from the search engine.

  2. If the server or the search engine spider software is misconfigured, the search engine spider may end up obtaining a Web page from another domain with the same IP address. This may mean that the other Web site gets indexed instead of yours, or that your Web site is found for keywords that apply to the other site.

  3. Rumor has it that having your own unique IP address may help your search engine ranking.

So when you select a Web hosting service, make sure that your domain name has a unique IP address, even if it means that you have to pay a bit more for your hosting.

Are you sharing an IP address with people you don't even know? Here's a way to test it yourself:

  1. Go to " http://www.name-space.com/search/ " and enter your domain name (for example, yahoo.com).

  2. The result page shows you what IP address your site resolves to (for example, 64.58.76.225)

  3. Copy the IP address to the clipboard.

  4. Open a new window in your Web browser, enter the IP address (for example, http://64.58.76.225) and hit Return.

  5. If your Web site appears, you have your own IP address. If another Web site or an error message appears, you probably share the IP address with others.

If you are unsure, ask your Web hosting service company if your Web site has its own IP address.
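
The same test can be scripted. Here is a minimal sketch (not part of the original article) that resolves your domain name, requests the front page once by name and once by bare IP address, and compares the two; the domain name is a placeholder:

import socket
import urllib.request
import urllib.error

DOMAIN = "www.yourdomain.com"   # placeholder: your own domain

def fetch(url):
    # Return the page body, or None if the request fails.
    try:
        return urllib.request.urlopen(url, timeout=10).read()
    except (urllib.error.URLError, OSError) as error:
        print("Could not fetch %s: %s" % (url, error))
        return None

ip_address = socket.gethostbyname(DOMAIN)
print("%s resolves to %s" % (DOMAIN, ip_address))

by_name = fetch("http://%s/" % DOMAIN)
by_ip = fetch("http://%s/" % ip_address)

if by_name is not None and by_name == by_ip:
    print("The bare IP address returns the same page; the site probably has its own IP address.")
else:
    print("The bare IP address returns a different page or an error; the IP address is probably shared.")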

Reason #10: You are hosting your Web site at a free Web space provider.

Some search engines (e.g. AltaVista) limit the number of pages they will index from a single domain. For example, if your Web page is hosted at Geocities.com or Tripod.com, it might happen that your Web site is not listed simply because the maximum page limit for that particular domain name has already been reached.

Some search engines no longer even index pages residing on common free Web hosting services. Their complaint is that they get too many spam or low-quality submissions from free Web site domains.

However, Google is the exception. Google does index Web pages on Geocities.com and Tripod.com. Those pages also seem to have a Google PageRank of at least 3/10 because they are linked from a popular domain.

In summary, if you are serious about doing business on the Web, it helps tremendously to have your own domain name. Put yourself in the customer's shoes: Would you buy from a Web site that is called "sub.free-web-space-provider.com/~category/162742/my_business/home.htm"?

A domain name is easier to remember for your customers and it helps to build trust.

And while we're at it: A recent survey by Consumers Union found that only 29 percent of Americans trust Internet sites that sell products.

Consumer WebWatch research report on trust:
http://www.consumerwebwatch.org/news/1_abstract.htm

How to gain your customers' trust:
http://www.ecommercebase.com/article/37

Reason #11: Your host server was non-operational during spidering.

It can happen that your Web server is down when a search engine spider tries to index it.

If your Web site fails to respond when the search engine spider visits your site, your site will not be indexed. Even worse, if your Web site is already indexed and the search engine spider finds that your site is down, you'll often be removed from the search engine database.

It's essential to have Web space on servers that are very seldom down. Choosing a reliable Web space provider is very important for a successful online business.

At first it sounds impressive when your Web space provider promises 99% server reliability. But take a moment to do the math: it means that 1% of the time, your potential customers cannot reach your Web site. One percent of a year is about 3.65 days, so your Web site could be down for nearly four days per year. That equals four days without sales.

As you can see, 99% reliability is not enough. You must constantly monitor your server so that you can act immediately if your server is down (e.g. call the service provider to restart the server).

Here's a free monitoring service:
http://www.internetseer.com

Another free Web site monitoring service (without ads):
http://uptime.openacs.org/uptime/

A freeware tool that can monitor the uptime of your site:
http://www.idyle.com/software/internet/host-monitor/

Monitoring service for bigger companies:
http://www.atwatch.com/
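
If you would rather run a simple check yourself, here is a minimal sketch (not part of the original article) that requests the home page every few minutes and prints a warning when it stops responding; the URL is a placeholder, and a real monitor would also send an e-mail or SMS alert:

import time
import urllib.request
import urllib.error

URL = "http://www.yourdomain.com/"   # placeholder: your own home page
CHECK_INTERVAL_SECONDS = 300         # check every five minutes

while True:
    try:
        response = urllib.request.urlopen(URL, timeout=15)
        print("%s is up (HTTP %d)" % (URL, response.getcode()))
    except urllib.error.HTTPError as error:
        print("%s answered with an error status: HTTP %d" % (URL, error.code))
    except (urllib.error.URLError, OSError) as error:
        print("%s appears to be DOWN: %s" % (URL, error))
    time.sleep(CHECK_INTERVAL_SECONDS)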

Reason #12: You don't allow robots to index your Web site.

Imagine you're an Internet marketing service company and you keep trying very hard to get a top ranking in the search engines for your customer.

Even after several weeks, the customer's Web site hasn't been listed in any search engine. Then you realize that the search engine spiders and robot programs cannot access the Web site because your customer is blocking them by mistake.

There are two ways to block search engine robots: a) with a simple text file in the root directory of the host server, or b) with a certain META tag in the Web pages.

a) Robots.txt

The host server might have a plain text file named "robots.txt" in the root directory. It contains rules for the search engine spiders. The rules in the robots.txt file follow the Robots Exclusion Protocol, a document designed to help Web administrators and authors of Web spiders agree on a way to navigate and catalog Web sites.

The content of the robots.txt file consists of two main commands: "User-agent" and "Disallow".

The User-agent command specifies the name of the robot to which the following commands apply. You can set this to "*" to have the spidering commands apply to any robot.

The second command, "Disallow", specifies a partial URL that should not be indexed by the Web robot.

The text

User-agent: *
Disallow: /

tells all search engine spider programs to go away. If you find a text file called "robots.txt" in the root directory of the host server with the above content, you should delete it immediately. The text file says that no search engine is allowed to index your Web site.
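
If you would rather keep a robots.txt file than delete it, a permissive version that lets every robot index the whole site looks like this (an empty Disallow value blocks nothing):

User-agent: *
Disallow: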

Even if your robots.txt file doesn't contain the above commands, you should make sure that its syntax is correct. A robots.txt file with faulty syntax can also prevent search engine spiders from indexing your Web site.

To check the syntax of your robots.txt file, you can use this free tool (just enter your domain name www.domain.com): http://www.sxw.org.uk/computing/robots/check.html
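
You can also test the file yourself. Here is a minimal sketch (not part of the original article) that uses the urllib.robotparser module from Python's standard library to check whether your robots.txt blocks robots from the pages you most want indexed; the domain name and page paths are placeholders:

import urllib.robotparser

SITE = "http://www.yourdomain.com"   # placeholder: your own domain

parser = urllib.robotparser.RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()   # downloads and parses the robots.txt file

# "*" stands for any robot; list the pages you most want indexed.
for page in ["/", "/products.html", "/contact.html"]:
    if parser.can_fetch("*", SITE + page):
        print("%s may be fetched by robots" % page)
    else:
        print("%s is BLOCKED for robots" % page)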

b) The META ROBOTS tag

There's a second way to stop search engine robot programs from indexing your Web site: the META ROBOTS tag. If you find the following HTML tag in your Web pages:

<META NAME="robots" CONTENT="noindex,nofollow">

you should replace it immediately with

<META NAME="robots" CONTENT"="index,follow">

If you want all search engine spiders to index all Web pages, you can also remove the META ROBOTS tag from your Web pages.
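
To double-check your existing pages, here is a minimal sketch (not part of the original article) that scans a local HTML file for a robots META tag and reports whether it contains "noindex" or "nofollow"; the file name is a placeholder:

from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    # Collects the CONTENT values of all <META NAME="robots" ...> tags.
    def __init__(self):
        super().__init__()
        self.robots_content = []

    def handle_starttag(self, tag, attrs):
        if tag.lower() != "meta":
            return
        attributes = dict((name.lower(), value or "") for name, value in attrs)
        if attributes.get("name", "").lower() == "robots":
            self.robots_content.append(attributes.get("content", ""))

finder = RobotsMetaFinder()
with open("index.html", encoding="utf-8") as page:   # placeholder file name
    finder.feed(page.read())

if not finder.robots_content:
    print("No robots META tag found; spiders index and follow links by default.")
for content in finder.robots_content:
    if "noindex" in content.lower() or "nofollow" in content.lower():
        print("Blocking robots tag found: %s" % content)
    else:
        print("Robots tag found: %s" % content)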

Further information about both ways to stop search engines from indexing your Web site can be found at:

- The Importance of Robots.txt
- http://www.wdvl.com/Location/Search/Robots.html
- http://www.ebrandmanagement.com/whitepapers/robots2.htm



About The Company

Axandra.com

Web site promotion software tools.

 