I have been, or can be if you click on a link and make a purchase, compensated via a cash payment, gift, or something else of value for writing this post. Regardless, I only recommend products or services I use personally and believe will be good for my readers.
There’s been talk lately that the Google Panda loves keyword-rich subdomains, so I decided to test this on my coupon website.
Aside from a WordPress blog, the pages on the coupon website are all driven from a coupon database. I use a system I wrote myself, although you can just as easily use a service like For Me to Coupon.
The merchant pages on the site look like this:
clubcouponcode.com/Flirty-Aprons-m925.php
The “m925” part in the filename tells my system this is merchant 925 in the database. If you change that ID, you’ll be shown another merchant’s coupons instead (and the URL will be corrected).
Previously, the full URL was www.clubcouponcode.com/Flirty-Aprons-m925.php, but I wanted to feed the Google Panda by putting the merchant name in the subdomain. Adding individual DNS entries for every merchant wasn't an option, and neither was adding that many ServerAliases in Apache; there are nearly 3,000 merchants in the database. So to get over this hurdle, I'm using wildcard DNS.
The first step is to add an A-record to your zone file, such as:
A * 184.72.255.56
Of course, you want to point this to your server, not mine.
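If you manage your DNS in a BIND-style zone file rather than a control panel, the wildcard record would look something like the following sketch (the TTL and IP here are placeholders; substitute your own server's address):

```
; matches any subdomain that has no more specific record of its own
*.clubcouponcode.com.   3600   IN   A   184.72.255.56
```

Once it propagates, you can confirm a made-up subdomain resolves with a quick lookup, e.g. dig anysubdomain.clubcouponcode.com.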
Then, you need to add the ServerAlias to the Apache configuration. I'm using Plesk, so I create a vhost.conf file in /var/www/vhosts/clubcouponcode.com/conf containing:
ServerAlias *.clubcouponcode.com
Now, you can go to anysubdomainyouwant.clubcouponcode.com and the site will come up.
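For reference, the whole vhost.conf can be as small as one line; here's a sketch (the path is from my Plesk box and may differ on yours):

```
# /var/www/vhosts/clubcouponcode.com/conf/vhost.conf
ServerAlias *.clubcouponcode.com
```

After creating or editing the file, Plesk has to regenerate the Apache config and Apache has to be reloaded before the alias takes effect; the exact command varies by Plesk version, so check your version's documentation.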
The next step is to automatically redirect the pages to their new, keyword-friendly subdomain URLs. Here’s how I’m checking to make sure the user is on the right page, and redirecting if not:
$cURIActual = 'http://' . $_SERVER['SERVER_NAME'] . $_SERVER['REDIRECT_URL'];
$cURIExpected = 'http://' . strtolower(simplify($rsMerchantData['cName'], true)) . '.clubcouponcode.com/'
    . simplify($rsMerchantData['cName']) . '-m' . $rsMerchantData['nMerchantID'] . '.php';

if ($cURIActual != $cURIExpected) {
    header("Location: $cURIExpected", TRUE, 301);
    exit();
} // ends if ($cURIActual != $cURIExpected)
Here’s the simplify function:
function simplify($cString, $bNoDashes = false) {
    $cString = str_replace("'", '', $cString);                 // drop apostrophes entirely
    $cString = preg_replace("/[^A-Za-z0-9]/", "-", $cString);  // every other non-alphanumeric becomes a dash
    $cString = preg_replace("/-+/", "-", $cString);            // collapse runs of dashes
    $cString = preg_replace("/\-$/", "", $cString);            // strip a trailing dash
    if ($bNoDashes) {
        return str_replace('-', '', $cString);
    } // ends if ($bNoDashes)
    else {
        return $cString;
    } // ends else from if ($bNoDashes)
} // ends function simplify($cString)
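For readers who don't write PHP, the slug logic above can be sketched equivalently in Python (the function mirrors simplify's behavior; this is an illustration, not code running on the site):

```python
import re

def simplify(s: str, no_dashes: bool = False) -> str:
    """Turn a merchant name into a URL-friendly slug, mirroring the PHP simplify()."""
    s = s.replace("'", "")                 # drop apostrophes entirely
    s = re.sub(r"[^A-Za-z0-9]", "-", s)    # every other non-alphanumeric becomes a dash
    s = re.sub(r"-+", "-", s)              # collapse runs of dashes
    s = re.sub(r"-$", "", s)               # strip a trailing dash
    return s.replace("-", "") if no_dashes else s

# e.g. simplify("Flirty Aprons") -> "Flirty-Aprons"   (the filename part)
# and simplify("Flirty Aprons", True) -> "FlirtyAprons"   (the subdomain part)
```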
Hopefully you can follow the PHP coding.
So if you have a large database-driven site, you can use wildcard DNS to create the appearance of many, many subdomains. Just be sure to put checks in place, so you don’t have thousands of copies of the page across all of the subdomains.
Comments
LGR
I read that post over at SEOBook as well, and while this might be true right now, Google is not going to keep it that way. Of course I do not know the mind of Google, but I would say that by creating hundreds or thousands of subdomains, Google will flag the root domain as web spam, and then you could have bigger problems to worry about.
Eric Nagel
Hey Lee – yeah, I agree Google will adjust this soon, but I don’t think they’ll label this any more or less spammy than using mod_rewrite rules to make keyword-rich filenames.
Igor
“Just be sure to put checks in place..” – You might want to drop subdomains for your Privacy Policy and Disclosure pages. It’s not that important, though. Just a heads up.
Eric Nagel
Yeah, those should be noindex, even on the main www site. I’ll add it to my list
Joe Zepernick
Great post Eric. Are you throwing all those urls into a sitemap for the Google?
Eric Nagel
I’m not 100% sure how to do that. I don’t want to submit all of them… and I don’t think I can submit a sitemap index from the root.
If / once I figure it out, I’ll let you know
richard v
Eric, Any updates on SERP rankings after making the changes?
Eric Nagel
Here’s a look at Google Analytics – can you guess when I turned on the subdomains, and when I turned them off?
Funny thing… that’s all Bing traffic – subdomains had no impact on Google.
I turned them off to sell the site, but it looks like I should turn them back on again!