Archive for the ‘Cloaking’ Category

The Most Cutting Edge SEO Exploits No One is Publishing

You know that the best SEO Black Hats are doing something more than scraping, using a site generator, comment spamming, and pinging to rake in more than $100k per month.

But what is it?

Right now, there is way too much good stuff that I simply can’t publish on the SEO Black Hat blog. If I posted these tactics and exploits they would immediately get all the wrong kind of attention. The detailed conversations about how exactly to abuse search engine algorithms, generate massive traffic, and what other Black Hats are doing must remain underground to retain their effectiveness.

But what if I told you that you could discuss these exploits with me without paying my $500 an hour consulting fee? What if I told you there was a way to join in on the private, cutting edge discussions with some of the best Black Hats and web entrepreneurs in the world?

Would you be interested?

Because now you can . . .

Today is the official launch of the resource you’ve looked everywhere for but never found:

The Private SEO Black Hat Forum

Normally what you get on forums are people who don’t know anything talking with people who don’t want to say anything. You can occasionally find amazing tips on some forums, but you have to dig through 400 crappy posts just to find one that is useful. That becomes a huge time sink.

How are the SEO Black Hat forums different?

Quality: We’re not going to have any contests to see who can make the most posts. That just creates tons of crap that no one wants to read. Our focus is on quality over quantity. Our primary concern is with succinctly answering one question: “What works?”

Sophistication: Many of the topics we discuss are very advanced and require a high level of technical or business acumen to appreciate.

Expert Discussions: The SEO Black Hat forums are not for everyone and they may not be right for you. If you are relatively new to SEO or building websites, then do not join the SEO Black Hat Forums: you will be in way over your head. There are plenty of newbie forums out there for you – this is not one of them. Our forums are for successful web entrepreneurs to develop strategies that drive more traffic and generate more revenues.

Forum Membership Benefits

Access to Expert Advice and Discussions
We have both White Hat and Black Hat experts who are already benefiting from new tool development, techniques, scripts, and the sharing of ideas.
Some members you may already be familiar with include:

* CountZero from blackhat-seo.com (Black Hat)

* RSnake from ha.ckers.org (Web Security Expert)

* Dan Kramer from Kloakit (Cloaking Expert)

* Jaimie Sirovich from seoegghead.com (Token White Hat / SEO Geek)

There are several other members that you are certainly familiar with who are using handles for anonymity. We have others who are more focused on security, vulnerabilities, and coding. There are still more that you are likely unfamiliar with but are nevertheless web millionaires.

Databases – Large Datasets
If you want your sites to have massive amounts of unique content you need large data sets. The trading, discussion and posting of large data sets is going on right now on our forums.

Expired / Deleted Domain Tools
Want to use the same domain tool that I used to get a PageRank 6 site in the gambling space for just $8? This domain tool is available for members to use for free.

50% off on Kloakit – The Professional Cloaking Software

Scripts – Several useful scripts have already been posted, and interesting things you may not have thought of before are being discussed and developed.

Exploits and Case Studies: The really good stuff I can’t talk about on the SEO Blackhat Blog is being discussed on the SEO Black Hat Forums. Right now, some of the conversations include beating captchas, domain kiting, data mining, hoax marketing, XSS vulnerabilities as they relate to SEO, and much more.

Pricing: $100 per month.

The price will soon be rising significantly as more databases, hosted tools, scripts and exploits are added. However, once you lock in a membership rate it will never go up and you will continue to have access to everything.

So, if you think you’re ready for the most intense Black Hat SEO discussions anywhere, then here’s what you need to do:

1. Register at the SEO Black Hat Forums.

2. Go to the User CP and select Paid Subscription.

I’ll see you on the inside!

Great Cloaking Article

Finally something worth linking to. There has been a bit of a blog drought this summer in the SEO space and I’ve been struggling to keep seoblackhat interesting and informative the last few weeks.

Dan from kloakit has written the comprehensive guide to cloaking to clear up some of the misconceptions surrounding the topic.

Notice that search engine optimization, while displayed prominently in the list, is not cloaking’s only purpose. When used for this purpose, many folks see it as unethical and search engines don’t like it. I’ll discuss that later. But there are many good reasons for cloaking which have to do with assisting the user’s experience.

I mentioned that cloaking programs look for certain criteria in a visitor’s request for a web page. Here is a short list:

* Their IP address
* Their User-Agent
* The HTTP_REFERER header
* The HTTP Accept-Language header

He goes on to explain in detail all the list items, plus language cloaking, the ethics of cloaking, and how the search engines view the practice. If you already cloak, this is a good refresher. If you’re new to the topic, this is a great place to start, or to send others who don’t understand cloaking.

I mentioned the SEO blogging drought lately; if you write or see something that I should be blogging about – send it to me. I’m quadszilla and my domain is seoblackhat.com . . . so the email address shouldn’t be that hard to figure out.

IP Delivery to Stop RSS “Content Thieves”

Tired of getting your content stolen from your RSS Feed and reposted on splogs? Here is a simple solution for you. I was inspired by RSnake to add some code to my .htaccess file to stop some of the people from scraping my feeds and will show you how to do the same.

Basically, all you need is the IP address of whoever is stealing your feed and you can deliver whatever content you want to them. One way you can get it is to “ping” the site – go to a DOS prompt and type “ping spammersite.com”. It’ll spit out the IP for you. Traceroute (tracert) will also work.

In my case, I just redirected any request from their IP address back to their own feed. I’m not sure yet, but this may cause a loop that makes their server post the same things over and over again.

If you want to deliver any kind of custom content to a specific IP address, you just need to add these three lines to your .htaccess file.

RewriteEngine on
RewriteCond %{REMOTE_ADDR} ^69\.16\.226\.12$
RewriteRule ^(.*)$ http://newfeedurl.com/feed [R=302,L]

Here, 69.16.226.12 is the IP address you want to target (the dots are escaped because the condition is a regular expression) and http://newfeedurl.com/feed is the custom content you want to send them.
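If you’d rather lock a scraper out entirely than feed them substitute content, a variant of the same idea returns a 403 Forbidden instead of a redirect. This is just a sketch; the IP address is the same placeholder used above:

```apache
RewriteEngine on
# Match the scraper's address exactly; dots escaped because this is a regex
RewriteCond %{REMOTE_ADDR} ^69\.16\.226\.12$
# [F] sends a 403 Forbidden response; [L] stops processing further rules
RewriteRule ^ - [F,L]
```

The redirect version is more fun, but the 403 costs you no outbound bandwidth at all.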

You can always test what content will be delivered by changing the IP address to that of the machine you are working on. You can check your IP Address here.

You can be as creative as you wish with what you feed them. You can even use them to blog and ping for you if you like. The possibilities are endless.

So why is a site about Search Engine Spamming teaching people how to stop their content from being splogged? Because I am a dirty link whore and this is the kind of thing that people like to link to.

It might even be the type of story that people like to Digg.

Dan Kramer Interview By Aaron Wall re: Cloaking

Aaron Wall Interview with Dan Kramer of Kloakit on Cloaking.

Google Bans NY Times For Cloaking!

Ha! That’s a headline that will never be true. Yup, they still have their PageRank 10 even though they are flagrantly violating Google’s Webmaster Guidelines.

Jaimie Sirovich tells the tale perfectly in his piece, The Google Cloaking Hypocrisy:

There is no doubt about it. What the New York Times is doing, without special Google accommodations, or at least their complicity, is a black hat technique according to Google for everyone else. Other search engines are less quick to vilify cloaking, so long as it is not used to spam. I agree, but Google is in a pickle here.

It is pretty tough to get banned JUST for cloaking / IP delivery. I think what Google really objects to is sending out spider food that is substantially different from a page’s content. If you want to use IP delivery for your Flash site, or to highlight certain terms in the title and H1s, odds are you’re going to be fine. Even sending out your paid content to the Googlebot, as the New York Times does, should not get you in hot water.

In fact, if you have paid subscription content and serve it to Google by cloaking and do get banned (and that’s the only Black Hat SEO you do), you could probably parlay it into some serious publicity because of the Google Hypocrisy.

It’s funny to still see people describe going outside of the Google Webmaster Guidelines as being “Unethical” as if Google is the final arbiter of Good, Evil and Morality. For those of you still in that camp, it’s time to wake up.

Rsnake on Detecting Cloaking in Apache

RSnake is reporting that Apache discloses information that could be used to detect cloakers:

Okay, but does that really help us? I mean, there’s no ETag at all right? Well, yes, and that’s the exact point. Because there is no ETag in the header and there is for a confirmed normal file, you can tell that that page is dynamically created using mod_rewrite or a ScriptAlias. But now you’re asking, “What if you don’t know if it normally has the ETag at all, or more specifically what if the entire htdocs directory is dynamic?” How about trying a file that is always there and lives outside of the htdocs directory? The Apache logo that is included with the base install inside the /icons directory definitely qualifies.
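The probe RSnake describes boils down to a header check, which is easy to script. A minimal sketch: a file served straight off disk by Apache normally carries an ETag header, while output generated through mod_rewrite or a ScriptAlias usually does not. In practice you would feed the function live headers, e.g. `curl -sI http://example.com/somepage.html | has_etag` (example.com is a placeholder, not from the original post); the demo below uses canned responses so it is self-contained:

```shell
#!/bin/sh
# Classify a response by whether an ETag header is present.
# Reads raw HTTP response headers on stdin.
has_etag() {
    if grep -qi '^etag:'; then
        echo "static file (ETag present)"
    else
        echo "likely dynamic (no ETag)"
    fi
}

# Headers like Apache sends for a plain file on disk:
printf 'HTTP/1.1 200 OK\r\nETag: "2c-3e9564c23b600"\r\n\r\n' | has_etag

# Headers like mod_rewrite-generated content returns:
printf 'HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n' | has_etag
```

Run the same check against the /icons/ Apache logo first, as RSnake suggests, to establish what a known static file on that server looks like.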

Kloakit Includes Mediabot IP Range

We have confirmed that the professional cloaking / IP delivery software, Kloakit, includes the Mediabot IP ranges. This way, in case Google is comparing the content served to the Googlebot and the Mediabot and penalizing if it is different, you will be covered if you are using Kloakit and send the same content to both.

Important Information For Cloakers

Shoemoney is reporting that :

Matt Cutts confirmed the recent rumors about media bot results getting into Big Daddy. Matt said it is a bandwidth saving feature to have GoogleBot and MediaBot both contributing to Big Daddy. Matt also stated that you will gain zero advantage in search listings; however, if you are serving different content to MediaBot than to Googlebot, you could be in trouble.

So if you are cloaking, make sure you are sending the same info to both the mediabot and the Googlebot. Any of you have IPs for mediabot?
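If you cloak by User-Agent rather than by IP, one hedged way to guarantee both crawlers see the same thing is to match them in a single condition so they always hit the same rule. This is only a sketch (Mediapartners-Google is the AdSense crawler’s published User-Agent string, and the filenames are placeholders):

```apache
RewriteEngine on
# Googlebot and the AdSense crawler (Mediapartners-Google) take the same
# branch, so both always receive identical content
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Mediapartners-Google) [NC]
RewriteRule ^page\.html$ /spider-version.html [L]
```

IP-based delivery is still the safer approach, since User-Agent strings are trivially spoofed; the point here is only that whatever branch logic you use must treat the two bots identically.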

Webmaster World – I think I get It

At first, I didn’t get it. While some of us thought he had lost his mind, we knew that Brett Tabke was no dummy. You don’t build the most successful webmaster forum on the web without understanding a thing or two about a thing or two.

As you may already know, he is now writing a blog on his robots.txt file (which is pretty funny). Clearly, he is cloaking to send the real robots.txt file to the search engine bots. Google, Yahoo, and MSN all know this – they don’t care. Getting banned just for cloaking is a myth.

As I see it, the bandwidth from the real search engine bots is obviously an acceptable cost of doing business. It’s all those other bots from scraper sites that you want to get rid of: the ones hitting 4 million pages per day each and pirating your content.

This blog entry about bot behavior and his experiment was interesting:

# After alot of testing and bot busting, the current robots.txt is what was
# settled on. I felt exposing the code was the best way to explain it all
# (see the actual robots.txt above for the full story).
#
# Testing the bots code and the security code to get it all right took alot
# of time. In the end, we found:
#
# – A surprising 21 bots that were following all the active list posts on a
# daily basis and downloading that content.
#
# – About 45 trademark and other page monitoring services. The majority of
# those monitoring services obey robots.txt.
#
# – 15 bots would accept cookies.
#
# – 2 more web sites reselling WebmasterWorld content. One in China and one
# in the stans. both out of legal reach.
#
# Sorry Shak – China will continue to be viewed with suspicion as long as
# it is still the wild-wild-west out there with few legal controls to
# protect content.
#
# – about 30 people who don’t understand the concept that if you look like
# a bot with a spoofed agent name – you are a bot.
#
# – Some of the worst bot running offenders? A few choice SEO’s.
# These are the same folks that bring you click bots and scrapper sites.
# I think they have little respect for other peoples content. I also think
# they don’t appreciate just how impactful their actions can be.
#
# – http://www.ojr.org/ojr/stories/051213niles/
#
# Thanks to Yahoo and MSN for the permission to treat your bots as if they
# were a tough steak during the testing and coding phase. Cloaking stuff for
# testing went a long way to being able to figure out the right
# balance of settings.

I don’t think Tabke was banned or hacked, and it wasn’t pride driving him to see if he could do without search engine traffic. Real search engine bots he can deal with; it’s all those other bots that he wanted to eliminate.

It was a strategic business decision to get rid of bandwidth-leeching non-search-engine bots run by people who were taking his paid content for free: one that will likely pay off in the long term.

Free Cloaking Script

You’re broke as a joke but want to cloak: So what can you do? How about a free cloaking script?

Let’s say you’ve used widgetbaiting or the markov chain to create 30,000 pages of unique content about bacon polenta recipes. Of course, no human surfer wants to read those pages but they are great spider food.

Well if you don’t want to use IP delivery like you’re supposed to, you can use this code to send your surfers to a sell page with text written for human consumption.

Now, this is not some obvious JavaScript redirect that will get you banned from the search engines. *If you use this code, you may get banned in some search engines.* Rather, it’s an error loophole designed for you to exploit:

<img src="nofilehere.gif" onerror="window.open('http://seoblackhat.com','_top')">

Just make a page with any kind of spider food / keyword spam that you want on it and then add that line to the page.

When surfers visit the page, they will be sent to “seoblackhat.com” because the requested image file does not exist (therefore there will be an error). The spiders and search engines, on the other hand, will all see the original page.

This free cloaking script is inferior to premium cloaking software for many reasons. If you are scraping content, this method does nothing to help you get past duplicate content filters. This free cloaking code does not protect your code from surfers or your competition. Surfers will briefly see these spider food pages load. They may, in turn, report you to the search engines, who could decide that using this code in the manner described is abusive. So, I would not recommend it for sites that you cannot afford to have banned.

Many high profile sites and Fortune 500 companies use cloaking to send different content to different IP addresses. But they don’t use code like this or cheesy redirect scripts; they use sophisticated cloaking software. IP delivery is the safer and preferred way to cloak. Honestly, I’ve never even heard of someone actually getting banned just for IP cloaking. I know that people do get banned for using crappy JavaScript redirects, but in my opinion, getting banned for IP cloaking is one of the great Black Hat SEO myths; it just doesn’t happen.