MarketPosition(tm) Monthly - May 1998 Issue

Now 35,000 Subscribers and Growing!
" ...because submitting to search engines is just NOT enough."
Techniques for Search Engine Positioning to Build Site Traffic

IN THIS ISSUE:
  • Why some pages rank high for no apparent reason!
  • Infoseek Stacking the Deck?
  • The Good News
  • WebPosition 1.20 Released
  • Newer is Better

    Why some pages rank high for no apparent reason!

    We've all been there: You do a search for a keyword that should be relevant to your Web site and a page appears near the top of the results for no logical reason. In many cases, the keyword queried isn't even on that top ranking page! At best, the keyword appears just once or twice while your Web site has been carefully constructed to incorporate that keyword several times - and it still ranks lower than this mystery page! Stop banging your head against the wall - there are reasons. Once you learn them you can combat these pages. Here are some things that can cause a Web page to rank higher than yours even when it doesn't appear to be optimized for a particular keyword or phrase:

  • 1. Out-of-date pages: The copy of a page in a search engine's index may be older than the page you see today. A Webmaster can change a page at any time, but unless that changed page is resubmitted, the search engine may not know to re-visit it, spider it, and update its index for quite some time. Even if a Webmaster resubmits the page, some engines take weeks to re-visit and re-index a site.
    What can you do about this kind of situation? If this is, in fact, the problem, you can go to the engine's URL submission page and submit the offending page's URL for re-indexing. After being re-indexed, the page should drop in rank. It may take a few weeks until the page is re-indexed, but taking the initiative and re-submitting it should accelerate the process. Infoseek, AltaVista, and HotBot will typically re-index a submitted page within 2-3 days or less.

  • 2. Dynamic Page Substitution: Some Webmasters create scripts on their Web server that detect the IP address or the "browser name" of a search engine's indexing spider visiting their site. When the script detects a spider, it "serves" a different Web page than the one you or I would see after clicking on the link in the search engine's results! There are a few clues that tell you this technique is being employed:

    A. The page you see does not contain the words used to describe it in the search engine's listing. The words used as the site's description in the search engine's index are *always* taken from the page itself - from the meta tags or the copy that makes up the page. If the words in the search engine's listing for that site are close, but not an exact match for any text on the page or in its meta tags, suspect this technique.

    B. The text found in the <TITLE> tag of the Web page is different from what the search engine's listing uses as the site title (usually the text that is presented as a blue hyperlink to the site). About half of the major search engines use only the text found in the page's title as their title for the site. Or, as above, the text doesn't appear anywhere on the page - in the <TITLE> tag or elsewhere.

    C. You submit the URL for re-indexing (use Infoseek to test because it takes only minutes) and, once the page is re-indexed, its position does not change, AND the title and description used by the search engine haven't changed to reflect what you know is actually on the page you were viewing.
    Typically, the page "served" to the search engine's spider is a rather unattractive page. Often these pages are much like any other doorway page you might create to secure a top ranking - they're primarily text, with high keyword frequency, prominence, and weight. The technique simply hides these pages from the general public - presumably because the optimized page isn't all that attractive, or so that the "secret techniques" cannot be copied by others seeking to outrank them.
    Sometimes this technique is abused to hide pages that employ blatant keyword stuffing, spamming, or other inappropriate techniques. Infoseek does not allow the technique and will remove pages from its index that use it. Personally, I feel that swapping pages can degrade the search engines' results and should not be used, since the page the user sees is not the one the search engine actually considered "relevant".
    That can of worms aside, your problem is that you need to outrank these hidden pages. To start with, relax and recognize that this technique gives them no magic-bullet advantage. They still have to build a high-ranking doorway page to serve to the search engines' spiders. The difficulty in outranking them is that you cannot view their HTML source and check their keyword frequency, weight, and so on. You can still tweak your page's keyword content to outscore them.
    One trick that most people overlook is to review the source code of another page that ranks higher than the one hiding its doorway page. After all, if a page outranks one using this substitution technique, it must have higher concentrations of keywords in the right places. Review this higher-ranking page and base your strategy on it instead - problem solved. You can also alert the search engine by e-mail that a Web site is using this technique; depending on how they feel about it, they can verify that this is what is happening and remove the page's listing from their index. (A rough sketch of how to automate the listing-versus-page comparison described above appears just after this list.)

  • 3. The "Ol' Switcheroo" technique: This technique involves building a page optimized to earn a top ranking and then swapping it out for your "real" page once the site has been indexed. It's a sleazy technique - but easily addressed. Like the dynamic page substitution described above, you detect it by comparing the listing in the search engine against what you see on the Web site. If the two don't match - e.g., the title or site description is not found on the actual Web site - chances are this technique was used. Unlike dynamic page substitution, when you re-submit the URL to the search engine the page will likely drop in rank, and the new listing will include copy found on the actual Web page. Resubmitting pages you find using this technique usually causes them to tumble down the search results to a position that doesn't compete with yours.

  • 4. New ranking algorithm: Search engines change their ranking algorithms from time to time. Techniques that worked well last month may not be as effective today. A search engine can also take some time to apply a new algorithm to its entire index. Until that happens, some older pages may continue to rank high, even though your submissions modeled after their success don't score well.
    The solution, again, is to re-submit the page in question. This should cause the search engine to apply the "new" rules to it, so that it is measured under the same relevancy system as your page. Once this has occurred, you will see the "true" rank of the page, and you can be certain you are modeling your page after pages that rank well under the current scoring rules.

  • 5. Page Popularity: Another reason a page that doesn't seem particularly optimized for a given keyword can rank well is that hundreds or even thousands of other Web sites have established links to it. Some search engines consider "page popularity" - how many other Web sites link to a particular page - in determining how relevant that page is.
    To determine whether this measure is keeping another page ahead of yours in the search results, you'll need to do a "Links to URL" search.
    A number of engines support a "Links to URL" search on their "Advanced" search options page. Others let you type the word "link:" followed by your URL into the search field (for example, link:www.yourdomain.com) to return a list of sites the search engine has recorded as linking to yours. If you have WebPosition with our "Secrets to Achieving a Top 10 Position" guide, you can look up the syntax for each engine. The guide also details which engines are believed to use popularity in their relevancy system.
    The popularity measure is another reason to spend part of your marketing effort soliciting links from other sites. I'm not entirely certain whether the search engines differentiate between links to your root domain page and links to internal pages; I suspect they only count the links to a specific page. If you have an opinion about this, let me know.

  • 6. Search Engine Bugs: Yes, even the big commercial search engines have bugs. Since they are continually fine-tuning their systems to provide better results, or to beat back the spammers, software glitches or "bugs" can easily make their way into the database. Sometimes a glitch is corrected quickly; in other cases it may cause pages to be scored incorrectly or poorly for quite some time.
    About all you can do in this situation is alert the search engine that xyz pages rank high on an xyz search even though they really are not relevant to that particular keyword. The "smart" search engines will listen and look into why the search results were poor. When people don't find documents relevant to the keywords they queried, they frequently try again on another engine - and search engines don't want that, because they make their money selling advertising to those visitors.

  • 7. The page is simply well optimized: Often the reason a page ranks high is that it simply fits the criteria the search engine is looking for.
    The search engines' algorithms are fairly sophisticated, so sometimes it takes a second look to understand why a page is positioned where it is. A number of factors affect search relevancy, including keyword "weight", "prominence", and "frequency", as well as avoiding techniques like repeating keywords too many times - a.k.a. "spam." (A rough illustration of how these measures can be approximated appears just after this list.) All these issues are discussed in simple, easy-to-understand language in past issues of this newsletter, as well as in our comprehensive 110-page report. This report is included FREE when you purchase WebPosition.

    If you would like to review this in-depth report in all its glory, simply order WebPosition for $99. If you aren't completely satisfied with the product, you can return it and keep the report for FREE. You won't beat that kind of guarantee anywhere!
    http://www.webposition.com/easy-order.htm

  • 8. Reviewed Pages: The final reason a page can rank in the top 10 to 30 matches for no apparent reason is that a human being put it there. Simply put, some search engines have employees whose job it is to review Web pages and pre-list them in their top rankings just so their search engine has more "good stuff" near the top - where most users look. See the next article for a case example.
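
    For the technically inclined, here's a rough sketch (in present-day Python) of the listing-versus-page comparison described in item 2 above: fetch the live page and check whether the title and description shown in the search engine's listing actually appear on it. The page URL and listing text below are made-up placeholders, and the comparison is deliberately simple - substitute the page and listing you are actually investigating.

    # Minimal sketch: does the engine's listed title/description appear on the live page?
    # PAGE_URL, LISTING_TITLE, and LISTING_SNIPPET are hypothetical placeholders.
    import re
    import urllib.request

    PAGE_URL = "http://www.example.com/some-page.html"
    LISTING_TITLE = "Widgets - Best Prices on Widgets"
    LISTING_SNIPPET = "We carry over 500 widget models"

    def normalize(text):
        """Strip HTML tags, collapse whitespace, and lower-case for a forgiving comparison."""
        text = re.sub(r"<[^>]+>", " ", text)
        return re.sub(r"\s+", " ", text).lower()

    page_html = urllib.request.urlopen(PAGE_URL).read().decode("latin-1", "replace")
    page_text = normalize(page_html)

    for label, phrase in [("title", LISTING_TITLE), ("description", LISTING_SNIPPET)]:
        if normalize(phrase) in page_text:
            print(f"Listing {label} WAS found on the live page.")
        else:
            print(f"Listing {label} was NOT found - possible page substitution.")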
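
    And here's a rough illustration, also in Python, of the keyword "frequency", "weight", and "prominence" measures mentioned in item 7. These are simplified approximations for comparing your page against a competitor's - the engines' actual formulas are not public, and the sample text is invented.

    # Simplified relevancy measures for one keyword on one page (approximation only).
    import re

    def keyword_stats(page_text, keyword):
        words = re.findall(r"[a-z0-9']+", page_text.lower())
        kw = keyword.lower().split()
        n = len(kw)
        # word positions where the keyword (or phrase) begins
        hits = [i for i in range(len(words) - n + 1) if words[i:i + n] == kw]
        frequency = len(hits)                                   # how many times it appears
        weight = frequency * n / len(words) if words else 0.0   # share of all words on the page
        # prominence: 100% if the first occurrence is the very first word, lower the later it appears
        prominence = (1 - hits[0] / len(words)) * 100 if hits else 0.0
        return frequency, weight, prominence

    body = "Discount widgets for sale. Our widgets ship free. Widgets are our only business."
    freq, weight, prom = keyword_stats(body, "widgets")
    print(f"frequency={freq}  weight={weight:.1%}  prominence={prom:.0f}%")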

    Infoseek Stacking the Deck?

    When I first heard about this from a close associate at the end of February, I thought it was just a rumor. However, in mid-March a second associate confirmed that it was true.
    Apparently Infoseek, in its quest to make relevant sites appear higher for popular keyword searches, is "stacking the deck". For at least some popular keywords, it employs "editors" to look for pages of reasonable quality that seem relevant to a given topic. It then forces these sites to the top of the search results for the appropriate keywords or, at a minimum, assigns extra "weight" to those pages.
    In my opinion, reviewed sites are great for many types of searches. However, these artificially elevated pages should be identified as such in the index and remain in the "sidebar" area of the results page. Most engines do this with "channel" or "category" areas. That combination offers the best of both worlds.
    Sneaking Web pages into actual keyword/phrase search results, without at least highlighting them as "reviewed" or putting them in a sidebar area, will simply confuse people. It seems a poor substitute for improving the search logic, something most people would prefer.
    For the average Web site owner, this means that pages for some keywords, at least on Infoseek, may rank in the top 10 to 30 positions even though the actual frequency and use of a keyword doesn't necessarily justify their rank. You might be thinking: if this is true, how can I get my site reviewed so that it gets an artificial boost?! First, make sure your site looks PROFESSIONAL, and then suggest that your site be added to the appropriate channel category by visiting this page:
    http://www.infoseek.com/Comments?pg=DCcomments.html
    If they choose to put you in a category, you may get additional weight in the regular searches as well.

    The Good News

    Despite the various problems and issues with achieving a high ranking, the good news is that high rankings are still very achievable. There are literally thousands of keywords and keyword combinations being queried, some of them thousands of times a day, and there are probably dozens of phrases relevant to your site. Only a handful of those keywords fall into the "high interest" category that is subject to editorial "review" and artificial elevation or a nudge up. Chances are, unless you're targeting a single, very general keyword like "travel" or "computer," you won't have to compete against listings that can't be beat. In addition, most engines review sites only for their separate "channel" areas, not for the general search results.
    Your efforts to draw waves of new traffic to your site by improving your search positions are still a *very* effective use of your marketing time and an achievable goal. Most pages that rank high are optimized pages, or pages that just happen to follow all the rules. However, you must know the proper techniques and have the right tools to apply them efficiently and give yourself a chance to be found. Search engines are one of the last level playing fields left, where small and large sites alike can compete.
    For people with an entrepreneurial spirit, this spells: o-p-p-o-r-t-u-n-i-t-y. To be successful, you first have to be found, and the most common way people look for a particular kind of Web site or topic is by using the search engines. The only realistic way to make it into the top positions is to optimize your pages, check your rank with WebPosition, and then optimize those pages again to climb higher. You'll then see targeted visitors who are looking for your product or service coming to your site every day.

    WebPosition 1.20 Released

    FirstPlace Software is proud to announce the release of an incremental upgrade to WebPosition Analyzer. The 1.20 update, which is free to all current users, adds two major search engines: Northern Light and Magellan.
    The update also discontinues tracking positions on the OpenText engine, which no longer accepts submissions from Web sites, and adds support for URL searching on WebCrawler, along with several other enhancements and fixes.
    WebPosition now supports 11 search engines, which combined probably account for 95% of all search engine traffic. Existing users can download the update from:
    http://www.webposition.com/updates.htm
    If you don't currently have WebPosition installed but wish to try the new version for FREE, you can download a free trial copy here:
    http://www.webposition.com/download-it.htm

    Newer is Better

    Here's a simple tip to help your page get a little extra "boost" up the ladder. Several engines are known to score newly submitted pages a little better, so if you need that boost, try re-submitting the page. You might make a slight change first, such as updating a "Last Revision" date at the bottom of the page, so the content looks "new" to the search engine's spider. This tip alone won't move you to the top, but it can help when combined with other accepted techniques.
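
    If you keep local copies of your pages, here is a small sketch (again in present-day Python) of that tip: rewrite the "Last Revision" line in the HTML file with today's date before you re-submit the page. The file name and the exact wording of the revision line are hypothetical - adjust them to match your own page.

    # Refresh a hypothetical "Last Revision: ..." line so the page looks newly updated.
    import re
    from datetime import date
    from pathlib import Path

    page = Path("doorway.html")                       # hypothetical local copy of the page
    html = page.read_text(encoding="latin-1")

    today = date.today().strftime("%B %d, %Y")        # e.g. "May 01, 1998"
    updated = re.sub(r"Last Revision: [^<]*", f"Last Revision: {today}", html)

    page.write_text(updated, encoding="latin-1")
    print(f"Updated revision date to {today}; now re-submit the page to the engines.")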

    LAST MONTH

    Last month I talked about several important topics including:
    - AltaVista Returns Random Results
    - Warning About Double Title Tag
    - New Rules at Infoseek & AltaVista
    - HotBot Tips
    If you missed these or other key discussions, you can find the back issues at:
    http://www.webposition.com/newsletters.htm

    LET ME KNOW WHAT YOU THINK

    I certainly hope you find this newsletter of value in your marketing efforts. If you have any suggestions, tips, or other comments, just REPLY to this e-mail.

    ABOUT THE AUTHORS

    MarketPosition is written by Brent Winters, President of FirstPlace Software, with editing and contributions by Frederick Marckini, President of Response Direct, Inc.

    OTHER RESOURCES

    FirstPlace Software produces several products, including WebPosition, the first software program to report your search positions on the major search engines and to help you improve those positions.
    You may download a FREE trial of WebPosition at:
    http://www.webposition.com
    You may call us at 1-800-962-4855 if you have questions not addressed on our site. You will also find an array of additional tips and techniques for improving your search positions in both the WebPosition Help File and the Reports it generates.
    FirstPlace Software also offers a complete report on search engine positioning entitled "Secrets to Achieving a Top 10 Position". This 110+ page report compiles all the latest information about the major search engines and how you can improve your positions in each. Currently, as a special bonus offer, this $79 report is included FREE with your purchase of WebPosition.

    SUBSCRIBE

    To subscribe to MarketPosition, simply e-mail: subscribe@webposition.com
    (c) copyright 1998 FirstPlace Software, Inc.