Archive for the ‘CSS Spam’ Category

CSS History Stealing Applied to Black Hat SEO

CGISecurity has an interesting write-up on how to use Jeremiah Grossman's CSS history stealing trick. For those of you who haven't already read it, there is a way to check where your users have been by reading the styling the browser applies to visited links. So, in answer to his August ponderism:

I wonder how long until the marketers start using this for additional visitor profiling. Feel free to view-source and find the trick.

Less than 2 months!

Say you run www.sitea.com, and www.siteb.com and www.sitec.com are your competitors. You know these companies use www.ad1.com and www.ad2.com to serve their ads. What you don't know is how effective those ads are; simply put, without direct access to their web server logs you can't really tell. Well, that isn't entirely true!

Let's say VisitorA visits your site, www.sitea.com. You can use the CSS history stealing trick to see if they have visited www.siteb.com and/or www.sitec.com. If they've visited a competitor, you know this person is semi-serious about whatever brought them to your site. Using the same CSS trick you could also enumerate a list of links against each competitor's website (a link is only revealed if it was visited) to see what they viewed on that site. This could include which products/services they are interested in, whether they visited the 'contact us' page, and possibly whether they reached the 'thank you for submitting your data' page (letting you know they submitted a form). Now that you know where your visitor has been, you can run the same trick against the websites advertising your competitors to see where they came from. Why bother? Because now you know which ads are in fact paying off for them, and you can advertise with the same company.
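The probe itself is simple. Here's a minimal sketch of my own (not CGISecurity's actual code; the competitor URLs and the red :visited color are placeholder assumptions): style visited links a known color, inject the URLs you care about, and read the computed color back.

```javascript
// Hypothetical list of competitor pages to probe -- placeholders, not
// URLs from the write-up.
const PROBE_URLS = [
  "http://www.siteb.com/",
  "http://www.sitec.com/contact-us/",
  "http://www.sitec.com/thank-you/",
];

// Pure helper: filter the probe list down to the URLs the visitor has
// been to, given a predicate that inspects link styling.
function probeHistory(urls, isVisited) {
  return urls.filter(isVisited);
}

// In the browser, the predicate reads the computed color of an injected
// link, assuming the page carries a stylesheet rule like:
//   a:visited { color: rgb(255, 0, 0); }
function makeDomPredicate(doc) {
  return function isVisited(url) {
    const a = doc.createElement("a");
    a.href = url;
    doc.body.appendChild(a);
    const visited =
      doc.defaultView.getComputedStyle(a).color === "rgb(255, 0, 0)";
    a.remove();
    return visited;
  };
}
```

Splitting the filter from the DOM predicate keeps the profiling logic testable outside a browser; swap in `makeDomPredicate(document)` on a live page.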

A more elaborate example: dynamically generate a discount if the current visitor has visited a competitor, potentially winning the deal.

So now you can map out all of your competitors, and if a visitor has been to one of them and you know the price on that site to be X, you can sell for X-5 (for example). The possible applications are endless . . . but are they legal? Dunno. I doubt there are any laws written about grabbing someone's CSS history. Any lawyers care to chime in?

How could SEO blackhats use this? Off the top of my head: manipulative "link trading". To automate the process, you set up a script that grabs the URLs of all the places you send this email:

Hi,

I really like your site (sitename.com). In fact, I have put a link to your site from my site (as you can see here). Don’t feel obligated, but I would really appreciate a link back when you get the chance.

Thanks much!

-Fakename

Then the link is only served when their site's CSS is in their history (which will be about 99% of the time).

Or you could use the same application with trackback spam . . . or referral spam. Trackback someone and only serve a link if their own CSS is in their history. That's some dirty pool, but you can bet people are going to do it, given how dramatically Google discounts reciprocal links.
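The serving side is the easy part. A hypothetical sketch (my illustration, not an actual script from the post): only render the promised reciprocal link when the history probe says the visitor has loaded the target site's own CSS, i.e., when the site owner is the one looking.

```javascript
// Serve the reciprocal link only to the person we emailed.
// "ownerIsLooking" would come from the same a:visited probe described
// above, pointed at the target site's own stylesheet URL instead of a
// competitor's page.
function reciprocalLinkHtml(siteName, siteUrl, ownerIsLooking) {
  if (!ownerIsLooking) return ""; // everyone else (and every spider) sees no link
  return '<a href="' + siteUrl + '">' + siteName + "</a>";
}
```

The site owner clicks through from the email, sees their link, and reciprocates; search engines crawling the page never see the outbound link at all.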

The Most Cutting Edge SEO Exploits No One is Publishing

You know that the best SEO Black Hats are doing something more than scraping, using a site generator, comment spamming, and pinging to rake in more than $100k per month.

But what is it?

Right now, there is way too much good stuff that I simply can’t publish on the SEO Black Hat blog. If I posted these tactics and exploits they would immediately get all the wrong kind of attention. The detailed conversations about how exactly to abuse search engine algorithms, generate massive traffic, and what other Black Hats are doing must remain underground to retain their effectiveness.

But what if I told you that you could discuss these exploits with me without paying my $500 an hour consulting fee? What if I told you there was a way to join in on the private, cutting edge discussions with some of the best Black Hats and web entrepreneurs in the world?

Would you be interested?

Because now you can . . .

Today is the official launch of the resource you’ve looked everywhere for but never found:

The Private SEO Black Hat Forum

Normally what you get on forums is people who don't know anything talking with people who don't want to say anything. You can occasionally find amazing tips on some forums, but you have to dig through 400 crappy posts just to find one that is useful. That becomes a huge time sink.

How are the SEO Black Hat forums different?

Quality: We’re not going to have any contests to see who can make the most posts. That just creates tons of crap that no one wants to read. Our focus is on quality over quantity. Our primary concern is with succinctly answering one question: “What works?”

Sophisticated: Many of the topics we discuss are very advanced and require a high level of technical or business acumen to appreciate.

Expert Discussions: The SEO Black Hat forums are not for everyone and they may not be right for you. If you are relatively new to SEO or building websites, then do not join the SEO Black Hat Forums: you will be in way over your head. There are plenty of newbie forums out there for you – this is not one of them. Our forums are for successful web entrepreneurs to develop strategies that drive more traffic and generate more revenues.

Forum Membership Benefits

Access to Expert Advice and Discussions
We have both White Hat and Black Hat experts who are already benefiting from new tool development, techniques, scripts and the sharing of ideas.
Some members you may already be familiar with include:

* CountZero from blackhat-seo.com (Black Hat)

* RSnake from ha.ckers.org (Web Security Expert)

* Dan Kramer from Kloakit (Cloaking Expert)

* Jaimie Sirovich from seoegghead.com (Token White Hat / SEO Geek)

There are several other members you are certainly familiar with who use handles for anonymity. We have others who are more focused on security, vulnerabilities, and coding. There are still more you are likely unfamiliar with who are nevertheless web millionaires.

Databases – Large Datasets
If you want your sites to have massive amounts of unique content you need large data sets. The trading, discussion and posting of large data sets is going on right now on our forums.

Expired / Deleted Domain Tools
Want to use the same domain tool that I used to get a PageRank 6 site in the gambling space for just $8? This domain tool is available for members to use for free.

50% off on Kloakit – The Professional Cloaking Software

Scripts – Several useful scripts have already been posted – interesting things you may not have thought of before are being discussed and developed.

Exploits and Case Studies: The really good stuff I can’t talk about on the SEO Blackhat Blog is being discussed on the SEO Black Hat Forums. Right now, some of the conversations include beating captchas, domain kiting, data mining, hoax marketing, XSS vulnerabilities as they relate to SEO, and much more.

Pricing: $100 per month.

The price will soon be rising significantly as more databases, hosted tools, scripts and exploits are added. However, once you lock in a membership rate it will never go up and you will continue to have access to everything.

So, if you think you’re ready for the most intense Black Hat SEO discussions anywhere, then here’s what you need to do:

1. Register at the SEO Black Hat Forums.

2. Go to the User CP and select Paid Subscription.

I’ll see you on the inside!

Link to Authorities and Hide These Links With CSS

Some site owners (blackhat and whitehat alike) are stingy with outbound links. They think that because the 1998 Google algorithm paper implied that each outbound link bleeds PageRank, this still applies today.

It does not.

In fact, our data and testing suggest that linking to 2-4 external subject authorities actually increases a page's credibility with the search engines.

How do you find subject authorities? If you're a white hat, you should already know which sites are relevant to your niche; link to them liberally.

For black hat SEO, your site-building script should include 1-4 links per page to subject authorities. Obviously, you want to automate this process.

These authorities could be:
1. A random result from the top 50 Google results for your keyword phrase.
2. Wikipedia pages.
3. DMOZ results.
4. Old-media news outfits (The New York Times, Forbes.com, etc.).

Obviously, since you don’t want people clicking on these links instead of your PPC or affiliate links, you have several options. These links should sit linearly (in the page source) very close to your body text. However, you can use CSS to put the division holding these links into cold zones on the page – or make them disappear entirely.

The white hat way to go is to use CSS to put these links in a cold portion of the page. This article from Google includes this diagram:

[Diagram: hot and cold zones of a web page for ad placement]

The white zones are read least.

Here are two tips from Captain Obvious:

1. Don’t nofollow these links – that defeats the purpose of linking to authorities.
2. Give the links target="_blank" so that if one of your surfers clicks one, there is still a chance they will also click on one of your ads.
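For the cold-zone placement, one approach is to keep the link division adjacent to the body text in the source and then absolutely position it into the page's cold lower region. A sketch (the #refs id and pixel values are my own placeholders, not from the post):

```css
/* The division sits right after the body text in the source order... */
#refs {
  position: absolute;  /* ...but renders in the cold lower-right corner */
  right: 10px;
  bottom: 10px;
  font-size: 0.8em;    /* small and unobtrusive, but still visible */
}
```

The spider linearizes the links next to your body copy; the human eye barely registers them.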

For the darker hats, you can use the following in your CSS to make a division invisible:

#important {
visibility: hidden;
}

or

#body {
display: none;
}

or, to shove the division off-screen rather than hide it outright:

#main {
position: absolute;
left: -9999px;
}

Notice I didn’t name the division something that screams I’m a spammer like “hidden”, “invisible”, “hide” or “HeyGoogleThisIsSpam.”

As Herman Sherman pointed out yesterday, “you can always pop that css file in your images directory … which so happens to be the same directory i always exclude in my robots.txt”

How Search Engine Spiders See Web Pages

Dr. Garcia’s article, The Keyword Density of Non-Sense, does two things really well:

1. It Explains how Search Engines look at web pages.

2. It cures insomnia.

Basically, a spider/bot comes to your site, reads your page source in order, and lumps everything together (termed linearization). So:

[Diagram: how some spiders see web sites]

Then, it looks at the words around your keywords and assigns contextual values to the links. This is part of how Google and Yahoo assign different values to different links on a page.
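A crude sketch of linearization (my own illustration of the behavior the article describes, not Dr. Garcia's code): strip the markup and keep the text in source order.

```javascript
// Lump a page's text together in source order, roughly the way a
// spider linearizes markup before analyzing keyword context.
function linearize(html) {
  return html
    .replace(/<[^>]*>/g, " ") // drop tags, keep text in source order
    .replace(/\s+/g, " ")     // collapse runs of whitespace
    .trim();
}
```

Run a table-based layout through this and you'll see why source order matters: navigation cells that come first in the markup land in front of your body copy, diluting the context around your keywords and links.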

So what? Well:

1. Use cascading style sheets (CSS) instead of tables. This gives you control over the order in which you deliver concepts to both the search engines and your users. CSS Zen Garden and A List Apart start to illustrate how robust CSS can be.

2. You can use CSS to hide or highlight portions of a page. Today you can go as heavy as you want with CSS spam without much fear of penalty. I will go into specific applications in another post.

3. Linking is more powerful when it appears in context. So links from within (apparent) body text will be given more weight.

4. Professional SEOs can start incorporating word pattern techniques, linearization strategies and local context analysis (LCA) into their optimization mix.