Check out this week’s Whiteboard Friday on SEOmoz. A series of very interesting “black hat” disclosures, particularly the cookie handling issue for affiliate marketers and a “point your DNS at someone’s web server to steal their links” tip. Genius, if that’s the kind of thing that floats your boat.
Anyway, in response to the “point your DNS at someone’s website and steal their links” idea, I’d suggest that SEO consultants need to talk to their web server admins a lot more. Think “SEO hosting” and you’re almost there. The original issue was highlighted as merely a security flaw that needed more attention, but I’m not surprised it’s been translated into a black hat link building technique. It’s exactly this kind of issue that gets overlooked, because the people who handle web server administration probably don’t think like SEOs.
How can you tell if this problem affects you, and what can you do to fix it?
The test: don’t go and buy a domain and point it at your website; that’s a bit of a dangerous test for your brand, wouldn’t you say? Instead, open your hosts file and add an entry mapping a made-up domain name, e.g. www.richardsdnstest.com, to your own web server’s IP, something like the entry sketched below.
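On Windows the hosts file lives at C:\Windows\System32\drivers\etc\hosts (it’s /etc/hosts on Linux or Mac). A minimal sketch of the entry, with 203.0.113.10 standing in for your real server IP:

    # made-up test domain pointed straight at your own web server
    # (203.0.113.10 is a placeholder; swap in your server's real IP)
    203.0.113.10    www.richardsdnstest.com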
Open up IE and go to that address. If your website displays, you have a problem.
The solution: use host headers configured in IIS so your site only answers for the domains you expect, rather than for any unknown domain that happens to resolve to your web server’s IP. If you tell this to a web server admin, they’ll run off, do their homework and come back with a solution that works; a rough sketch of what that might look like follows.
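Purely as an illustration, and assuming IIS 7 or later with the appcmd tool (on IIS 6 the same thing is done via the host header values in the site’s properties in IIS Manager), the bindings could be set up along these lines. “Default Web Site” and www.example.com are placeholders for your own site name and domain:

    rem bind the site to the hostname you actually want to answer for
    appcmd set site /site.name:"Default Web Site" /+bindings.[protocol='http',bindingInformation='*:80:www.example.com']

    rem remove the catch-all binding with no host header, so requests for unknown domains are rejected
    appcmd set site /site.name:"Default Web Site" /-bindings.[protocol='http',bindingInformation='*:80:']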
There was a comment on the post that even if you set up host headers (and therefore serve a “host not recognized” error instead of your website), a proxy service could be used to get round the issue. Proxy servers can be blocked (though this is much harder) through the following methods:
1. An application gateway firewall can be configured to block known proxy IPs and detect unknown proxy IPs
2. .htaccess can block known proxy IPs (as can an ISAPI filter on IIS); there’s a rough sketch of this after the list.
3. Some proxies even have their own user agents. You could choose not to serve content when the detected user agent is neither browser based nor a known search engine. User-agents.org has a comprehensive database available, with an RSS feed for new bots.
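A rough sketch of points 2 and 3 in Apache 2.2-style .htaccess syntax; the IP addresses and user agent strings below are made-up placeholders, not a real proxy blacklist:

    # refuse requests from known proxy IPs (placeholders shown)
    Order Allow,Deny
    Allow from all
    Deny from 198.51.100.23
    Deny from 203.0.113.0/24

    # refuse requests whose user agent matches a known proxy string (placeholders shown)
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (exampleproxy|anonybot) [NC]
    RewriteRule .* - [F]

In practice you’d feed that IP list from a maintained proxy blacklist rather than hard-coding a couple of addresses.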
In summary, the wider white hat SEO community would never carry out this kind of link building technique. It’s unethical and quite possibly fraudulent. Anyone with decent link monitoring would catch it reasonably quickly, certainly before losing a few links did any real damage. I wouldn’t want to associate my agency brand with this type of link building, and I certainly wouldn’t want to be the one dealing with the legal consequences when it got caught!
SEOs should definitely talk to their web server admins a bit more often. They’re geeks, too.