Web Site Traffic Building Basics
A few changes may be enough to make you tops in one or two search engines. But that's not enough for some people, and they will invest days creating special pages and changing their sites to try to do better. This time could usually be put to better use pursuing non-search engine publicity methods.
Naturally, anyone who runs a web site wants to be in the "top ten" results. This is because most users will find a result they like in the top ten. Being listed at position 11 or beyond means that many people may miss your web site.
The tips below will help you come closer to this goal, both for the keywords you think are important and for phrases you may not even be anticipating.
Finally, know when it's time to call it quits. Don't obsess over your ranking. Even if you follow every tip and find no improvement, you still have gained something. You will know that search engines are not the way you'll be attracting traffic. You can concentrate your efforts in more productive areas, rather than wasting your valuable time.
Types of Internet Listing Resources
For web surfers, the major search engines generally mean more dependable results. These search engines are much more likely to be updated frequently and to keep up with all the new pages submitted every minute.
For webmasters and site owners alike, getting listed in the top 40-60 positions in a search engine is much more likely to bring traffic and hits to one's site. For example, a website that is listed in Yahoo! will most likely receive more traffic than one listed in Magellan. While both are fantastic engines, Yahoo! is much more well known and likely to drive more traffic to your site.
Search Engines, Directories and Combinations of the Two
A webmaster and web surfer should know the difference between these types of sites, because they are often misused and confused.
Directories:
A directory such as Yahoo depends on humans for its listings. You submit a short description to the directory for your entire site, or editors write one for sites they review. A search looks for matches only in the descriptions submitted. Changing your web pages has no effect on your listing. Things that are useful for improving a listing with a search engine have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed than a poor site.
Search Engines:
Commonly referred to as "spiders" or "crawlers," search engines are searching the web for new pages at all times. Because they are automated and index so many sites, search engines may often find information not listed in directories. On the flip side, they may also pull up unrelated information for the topics you're searching for. Search engines, such as HotBot, create their listings automatically. Search engines crawl the web, then people search through what they have found. If you change your web pages, search engines eventually find these changes, and that can affect how you are listed. Page titles, body copy and other elements all play a role.
Combination Search Engines:
Some search engines maintain an associated directory. Being included in a search engine's directory is usually a combination of luck and quality. Sometimes you can "submit" your site for review, but there is no guarantee that it will be included. Reviewers often keep an eye on sites submitted to announcement places, then choose to add those that look appealing.
The Parts of a Search Engine
A search engine has three parts. The first is the spider, or crawler, which visits web pages, reads them, and follows links to other pages within the site. Everything the spider finds goes into the second part of a search engine, the index. The index, sometimes called the catalog, is like a giant book containing a copy of every web page that the spider finds. If a web page changes, then this book is updated with the new information.
Sometimes it can take a while for new pages or changes that the spider finds to be added to the index. Thus, a web page may have been "spidered" but not yet "indexed." Until it is indexed -- added to the index -- it is not available to those who are searching with the search engine.
Search engine software is the third part of a search engine. This is the program that sifts through the millions of pages recorded in the index to find matches to a search and rank them in order of what it believes is most relevant.
How Search Engines Rank Web Pages
Overall, search engines do a tremendous job in pulling up relevant information, and getting your site listed in the top of these engines can mean tremendous results for your site. Search engines don't have the ability to ask questions, so they rely on what you've entered for your search. While this may be changing with the likes of intelligent agents, don't expect the same kind of customer service you might find from your local librarian.
Search engines will also check to see if the keywords you've entered appear in the top of the web page, like in the headline or in the first few paragraphs of text. Engines will assume that if the topic is important, it will be mentioned within the first part of your site. Frequency will also factor into how search engines determine relevancy. A search engine will determine how often keywords appear in relation to other words in a web page. Pages with a higher frequency of keywords are often deemed more relevant than pages with a lower frequency.
Note: Some search engines index more web pages than others. This means that no two engines will bring up the same pages. Each has their own method of pulling up information, resulting in different information being considered more relevant. Some search engines also give web pages a bonus for various reasons. For example, WebCrawler uses link popularity as part of its ranking method. It can tell which pages in its index have lots of links pointing at them. These pages are given a slight bonus during ranking, with the reasoning being if a page has a lot of links to it, it's probably a very popular page. Some combination type engines, those containing directories, may give a bonus to sites they've reviewed.
In particular, that means you need HTML text on your page. Sometimes sites present large sections of copy via graphics. It looks pretty, but search engines can't read those graphics. That means they miss out on text that might make your site more relevant. Some of the search engines will index ALT text and comment information, along with meta tags. But to be safe, use HTML text whenever possible. Some of your human visitors will appreciate it, also.
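For example, an image-heavy page can still give search engines something to read by pairing each graphic with ALT text and repeating the key copy as plain HTML. The file names and wording below are illustrative:

```html
<!-- The graphic still displays, but the ALT text gives
     search engines readable copy they can index -->
<img src="stamps-header.gif" alt="Stamp collecting supplies and guides">

<!-- Better still: repeat the important copy as plain HTML text -->
<p>Stamp collecting supplies, price guides and news for collectors.</p>
```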
Be sure that your HTML text is "visible." Some designers try to spam search engines by repeating keywords in a tiny font or in the same color as the background color to make the text invisible to browsers. Search engines are catching on to these and other tricks. Expect that if the text is not visible in a browser, then it won't be indexed by a search engine.
Finally, consider "expanding" your text references, where appropriate. For example, a stamp collecting page might have references to "collectors" and "collecting." Expanding these references to "stamp collectors" and "stamp collecting" reinforces your strategic keywords in a legitimate and natural manner. Your page really is about stamp collecting, but edits may have reduced its relevancy unintentionally.
Avoid Search Engine Stumbling Blocks
Have HTML links:
Often, designers create only image map links from the home page to inside pages. A search engine that can't follow these links won't be able to get "inside" the site. Unfortunately, the most descriptive, relevant pages are often inside pages rather than the home page.
Solve this problem by adding some HTML hyperlinks to the home page, a strategy that will help some of your human visitors, as well. If you put the HTML links down at the bottom of the page, the search engine will find them and follow them.
Also consider making a site map page with text links to everything in your web site. You can submit this page, which will help the search engines locate additional pages within your web site.
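A site map page can be as simple as a list of plain HTML hyperlinks that a spider can follow. The page names below are hypothetical:

```html
<!-- sitemap.html: plain text links a spider can follow -->
<h1>Site Map</h1>
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="stamp-collecting.html">Stamp Collecting Basics</a></li>
  <li><a href="stamp-history.html">History of Stamps</a></li>
  <li><a href="contact.html">Contact Us</a></li>
</ul>
```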
Frames can kill:
Some of the major search engines cannot follow frame links. Make sure there is an alternative method for them to enter and index your site, either through meta tags or smart design. For more information, see the tips on using frames.
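One common workaround is the NOFRAMES tag, which gives non-frame-aware spiders (and browsers) indexable text and crawlable links. A minimal sketch, with hypothetical file names:

```html
<frameset cols="30%,70%">
  <frame src="menu.html">
  <frame src="content.html">
  <!-- Spiders that cannot follow frame links read this instead -->
  <noframes>
    <body>
      <h1>Stamp Collecting Basics</h1>
      <p>Guides, supplies and news for stamp collectors.</p>
      <!-- A plain link lets the spider reach the inside pages -->
      <a href="content.html">Enter the site</a>
    </body>
  </noframes>
</frameset>
```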
Dynamic Doorblock:
Generating pages via CGI or database-delivery? Expect that some of the search engines won't be able to index them. Consider creating static pages whenever possible, perhaps using the database to update the pages, not to generate them on the fly. Also, avoid symbols in your URLs, especially the "?" symbol. Search engines tend to choke on it.
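For example, a spider would typically index the first link below but might stop at the second because of the "?" in the URL. Both addresses are hypothetical:

```html
<!-- Spider-friendly: a static page -->
<a href="stamps/history.html">Stamp History</a>

<!-- May be skipped: a page generated on the fly -->
<a href="cgi-bin/page.cgi?id=42">Stamp History</a>
```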
Search engine spamming can also affect your web site's position.
Pick Strategic Keywords
For example, say you have a page devoted to stamp collecting. Anytime someone types "stamp collecting," you want your page to be in the top ten results. Then those are your strategic keywords for that page.
Each page in your web site will have different strategic keywords that reflect the page's content. For example, say you have another page about the history of stamps. Then "stamp history" might be your keywords for that page.
Your strategic keywords should always be two or more words long. Usually, too many sites will be relevant for a single word, such as "stamps." This "competition" means your odds of success are lower. Don't waste your time fighting the odds. Pick phrases of two or more words, and you'll have a better shot at success.
Position Your Keywords
Make sure your strategic keywords appear in the crucial locations on your web pages. The page title is most important. Failure to put strategic keywords in the page title is the main reason why perfectly relevant web pages may be poorly ranked.
Search engines also like pages where keywords appear "high" on the page. To accommodate them, use your strategic keywords for your page headline, if possible. Have them also appear in the first paragraphs of your web page.
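Putting this together, a stamp collecting page might place its strategic keywords like this (the title and copy are illustrative):

```html
<html>
<head>
  <!-- Most important: strategic keywords in the page title -->
  <title>Stamp Collecting: A Beginner's Guide</title>
</head>
<body>
  <!-- Keywords "high" on the page: the headline... -->
  <h1>Stamp Collecting</h1>
  <!-- ...and the first paragraph of body text -->
  <p>Stamp collecting is a rewarding hobby enjoyed worldwide.</p>
</body>
</html>
```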
Keep in mind that tables can "push" your text further down the page, making keywords less relevant because they appear lower on the page. This is because tables break apart when search engines read them. For example, picture a typical two-column page, where the first column has navigational links, while the second column has the keyword loaded text. Humans see that page like this:
Page 1    Stamp Collecting
Page 2    Stamp collecting is a worldwide experience.
Page 3    Thousands enjoy it every day, and millions
Page 4    can be made from this hobby/business.
Search engines (and those with old browsers) see the page like this:
Page 1
Page 2
Page 3
Page 4
Stamp Collecting
Stamp collecting is a worldwide experience. Thousands enjoy it every day, and millions can be made from this hobby/business.
See how the keywords have moved down the page? There's no easy way around this, except to use meta tags. They will help for the search engines that use them. For the others, it may not be that big a problem. Consider how tables might affect your page, but don't necessarily stop using them.
Large sections of JavaScript can also have the same effect as tables. The search engine reads this information first, which causes the normal HTML text to appear lower on the page. Place your script further down on the page, if possible. As with tables, the use of meta tags can also help.
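The meta tags in question go in the HEAD of the page, where they appear before any table or script content. The description and keyword list below are illustrative:

```html
<head>
  <title>Stamp Collecting: A Beginner's Guide</title>
  <!-- Read by the search engines that support meta tags,
       regardless of where tables or scripts push the body text -->
  <meta name="description"
        content="Tips and guides for stamp collectors of all levels.">
  <meta name="keywords"
        content="stamp collecting, stamp collectors, stamp history">
</head>
```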
Search Engine Spamming: Don't Do it.
Also, search engine spamming attempts usually center around being top ranked for extremely popular keywords. You can try to fight that battle against other sites, but then be prepared to spend a lot of time each week, if not each day, defending your ranking. That effort usually would be better spent on networking and alternative forms of publicity, as described in Beyond Search Engines.
Compare search engine spamming to spam mail. No one likes spam mail, and sites that use spam mail services often face a backlash from those on the receiving end. Sites that spam search engines degrade the value of search engine listings. As the problem grows, these sites may face the same backlash that spam mail generates. The content of most web pages ought to be enough for search engines to determine relevancy without webmasters having to resort to repeating keywords for no reason other than to try to "beat" other web pages. The stakes will simply keep rising, and users will also begin to hate sites that undertake these measures.