How Search Engine Spiders See Web Pages

Dr Garcia’s article, The Keyword Density of Non-Sense, does two things really well:

1. It explains how search engines look at web pages.

2. It cures insomnia.

Basically, a spider/bot comes to your site, reads your page source from top to bottom, and lumps all the text together into one flat stream (a process termed linearization). So:

[Image: how some spiders see web sites]
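To make the linearization idea concrete, here's a minimal Python sketch of the flat text stream a spider might be left with after reading a table-based page. The sample page and the Linearizer class are my own illustration; real crawlers obviously do far more than strip tags.

```python
# A toy "linearizer": read the raw HTML in source order, throw the markup
# away, and keep one flat run of text. The table-layout page below is
# hypothetical and only here for illustration.
from html.parser import HTMLParser

class Linearizer(HTMLParser):
    """Collect text in the order it appears in the page source."""
    def __init__(self):
        super().__init__()
        self.words = []

    def handle_data(self, data):
        self.words.extend(data.split())

TABLE_PAGE = """
<table>
  <tr><td>Home | About | Contact | Links</td></tr>
  <tr><td><h1>Blue Widgets</h1>
          <p>We sell the best blue widgets on the web.</p></td></tr>
  <tr><td>Copyright 2005</td></tr>
</table>
"""

spider = Linearizer()
spider.feed(TABLE_PAGE)
print(" ".join(spider.words))
# Home | About | Contact | Links Blue Widgets We sell the best blue
# widgets on the web. Copyright 2005
# With a table layout, the navigation text lands in front of the content.
```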

Then, it looks at the words around your keywords and assigns contextual values to the links. This is part of how Google and Yahoo assign different values to different links on a page.
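And here's a rough sketch of what "the words around the link" could mean once the page has been flattened. The window size, the sample text, and the anchors are all made up; the point is simply that each link reaches the engine wrapped in a small bundle of nearby words.

```python
# Toy "local context": for each link, keep its anchor text plus a few of the
# words that sit around it in the linearized text. The window size of 3 is
# arbitrary, chosen only for this example.

# Linearized text containing two hypothetical links, identified by anchor text.
TEXT = ("Home About Contact We reviewed the best blue widgets this year "
        "and found that cheap plastic toys fall apart within a week")
ANCHOR_TEXTS = ["blue widgets", "plastic toys"]

def local_context(text, anchor, window=3):
    """Return the anchor plus up to `window` words on either side of it."""
    words = text.split()
    anchor_words = anchor.split()
    for i in range(len(words) - len(anchor_words) + 1):
        if words[i:i + len(anchor_words)] == anchor_words:
            start = max(0, i - window)
            end = i + len(anchor_words) + window
            return words[start:end]
    return []

for anchor in ANCHOR_TEXTS:
    print(anchor, "->", " ".join(local_context(TEXT, anchor)))
# blue widgets -> reviewed the best blue widgets this year and
# plastic toys -> found that cheap plastic toys fall apart within
```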

So what? Well:

1. Use cascading style sheets (CSS) instead of tables for layout. This gives you control over the order in which you deliver concepts to both the search engines and your users (see the first sketch after this list). CSS Zen Garden and A List Apart start to illustrate how robust CSS can be.

2. You can use CSS to hide or highlight portions of a page (the second sketch after this list shows the mechanics). Today you can go as heavy as you want with CSS spam without much fear of penalty. I will go into specific applications in another post.

3. Linking is more powerful when it appears in context, so links from within (apparent) body text will be given more weight.

4. Professional SEOs can start incorporating word pattern techniques, linearization strategies and local context analysis (LCA) into their optimization mix.
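To put point 1 into practice, here's a sketch of the same sort of page laid out with CSS instead of a table: the content sits first in the source and the stylesheet draws the navigation wherever you want it, so the linearized text now leads with your keywords. The markup and the crude tag-stripper are my own illustration, not a recommended stylesheet.

```python
# Sketch for point 1: with a CSS layout you choose the source order, so the
# spider meets your content before your navigation.
import re

CSS_PAGE = """
<style>
  #nav  { position: absolute; top: 0; left: 0; }  /* drawn at the top...    */
  #main { margin-top: 40px; }
</style>
<div id="main">
  <h1>Blue Widgets</h1>
  <p>We sell the best blue widgets on the web.</p>
</div>
<div id="nav">Home | About | Contact | Links</div>  <!-- ...but last in source -->
"""

def linearize(html):
    """Crude tag-stripper: drop the stylesheet, comments, and markup."""
    text = re.sub(r"<style>.*?</style>", " ", html, flags=re.S)
    text = re.sub(r"<!--.*?-->", " ", text, flags=re.S)
    text = re.sub(r"<[^>]+>", " ", text)
    return " ".join(text.split())

print(linearize(CSS_PAGE))
# Blue Widgets We sell the best blue widgets on the web. Home | About | Contact | Links
# Same rendered page, but the content now comes first in the linearized text.
```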
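And for point 2, a quick look at why hidden text even registers: a block styled display:none never renders for your visitors, but it is still sitting in the source, so a tag-stripping spider reads it anyway. Again, the page below is made up purely to show the mechanics.

```python
# Sketch for point 2: text hidden with display:none is still part of the
# source, so it shows up in the linearized text even though visitors never
# see it. Hypothetical page, same crude tag-stripping idea as above.
import re

HIDDEN_PAGE = """
<div style="display: none;">
  cheap blue widgets discount blue widgets wholesale blue widgets
</div>
<h1>Blue Widgets</h1>
<p>We sell the best blue widgets on the web.</p>
"""

def linearize(html):
    """Crude tag-stripper: keep the text, drop the markup."""
    return " ".join(re.sub(r"<[^>]+>", " ", html).split())

print(linearize(HIDDEN_PAGE))
# cheap blue widgets discount blue widgets wholesale blue widgets Blue Widgets
# We sell the best blue widgets on the web.
# The hidden block never renders, but it is right there in what the spider reads.
```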


5 Responses to “How Search Engine Spiders See Web Pages”

  1. hermen shermen says:

    You can always pop that CSS file in your images directory… which just so happens to be the same directory I always exclude in my robots.txt

    😉

  2. Tito3 says:

    I've been reading this blog for a while and let me tell you, you rock. I've learnt a lot from this site.
    Do you think you could elaborate a bit on how this CSS spam works?
    I'm not looking to spam, but I want to get a few keywords in for AdSense on one of my sites. Since my site is an image host there is not much I can do in terms of optimizing keywords for AdSense. So the ads suck, and I'm not getting many clicks.
    What I would like to do is sneak in a few keywords without my visitors seeing a bunch of non-related words on the site, and get more attractive ads on the site.
    Thanks for any help you can give me.

  3. CountZero says:

    I always put the content first in the code, then navigation, footer etc.

    Tito3,
    You can check out the article I just wrote:
    http://www.blackhat-seo.com/2005/hide-with-css/

  4. […] Here’s an interesting look at the way search engine spiders actually read the data on your web pages from SEO BlackHat here. You can organize your data through a process of linearization which can increase the value of your existing on-page factors and the actual content. Everything will be placed in proper context and you’ll be able to control what the spiders see and how they see it by using the power of CSS layouts instead of tables in your web page design. […]