
Something You Need To Know About Duplicate content And CSS

My last three posts covered a variety of questions regarding keyword usage, links and website architecture. In this post I'll address the final question, which deals with the visual display of your pages, duplicate content and CSS.


While on the BBC website I noticed that they have an optional low graphics version of all their pages. I am not sure how they do this, but I decided I could do the same by making a low graphics imitation of each of my pages, with a button on each page that allows people to switch back and forth. I have made all my low graphics pages not only with little or no graphics but also with web safe colors and web safe fonts. The text on the low graphics pages is identical to my regular pages.
Is this considered duplicate content? Will this hurt my search rankings? If it would hurt my rankings, could I avoid that by using nofollow tags on the links to the low graphics pages, or would they still get indexed and subsequently hurt my rankings? Is there something I could insert in a robots.txt file so that spiders would not visit those pages at all?


The way this is done is by creating multiple Cascading Style Sheets (CSS). BBC.com allows its users to change display settings in a variety of ways, including six preset options. Each option, once selected, loads a different style sheet used to display the page. The page URL remains the same, but the way the content appears changes.
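As a rough sketch of the mechanism (the markup BBC actually uses may differ, and the file names here are illustrative), a single page can offer several display options through alternate stylesheets, all under one URL:

```html
<head>
  <!-- The default display -->
  <link rel="stylesheet" href="/css/standard.css" title="Standard">
  <!-- Alternate displays: switching to one of these changes the look,
       but the URL and the content never change -->
  <link rel="alternate stylesheet" href="/css/low-graphics.css" title="Low graphics">
  <link rel="alternate stylesheet" href="/css/high-contrast.css" title="High contrast">
</head>
```

A small piece of JavaScript (or a browser's built-in style switcher) toggles which sheet is active, so the search engines only ever see one copy of the content.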


By using CSS to change the display, BBC.com, or any site really, can have unlimited viewing options without creating any duplicate content issues. Based on the question above, the way the low graphics version was implemented will produce multiple pages (URLs) that use the same content as the "normal" version. This creates duplicate content that will potentially be a problem for the search engines. And yes, it could affect your rankings.


If you want to implement multiple layouts of your pages, or even a printer-friendly version, CSS is the way to go. It's the easiest and cleanest approach, and it avoids any potential duplicate content issues. However, if you want to do things the hard way, there are a couple of things you can do to help prevent duplicate content problems.
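For the printer-friendly case in particular, a media-specific stylesheet keeps everything on one URL. A minimal sketch (file names are illustrative):

```html
<head>
  <!-- Styles applied when viewing on screen -->
  <link rel="stylesheet" href="/css/screen.css" media="screen">
  <!-- Styles applied automatically when the user prints;
       no separate printer-friendly URL is ever created -->
  <link rel="stylesheet" href="/css/print.css" media="print">
</head>
```

Because the print layout is just another rendering of the same URL, there is no second copy of the page for the search engines to find.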


The first, as mentioned in the question, is to use the nofollow attribute. All links pointing to these alternate versions should be nofollowed. I'd also back that up with a disallow rule in your robots.txt file.
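Assuming the low graphics copies live in their own directory (the path here is illustrative), the link markup would look like this:

```html
<!-- Link to the alternate version, marked nofollow -->
<a href="/low-graphics/about.html" rel="nofollow">Low graphics version</a>
```

And the matching robots.txt rule, placed at the root of the site, would keep spiders out of that directory entirely:

```
User-agent: *
Disallow: /low-graphics/
```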


Second, you could implement the canonical tag. In the <head> section of your code, place the following:


<link rel="canonical" href="http://www.site.com/original-page.html">


This tag needs to be placed on each alternate version of the page, with the link pointing back to the main version. It tells the search engines that the other versions are not the real one and therefore should not be treated as duplicate content.


These are band-aid solutions, and I wouldn't recommend them. Creating unique CSS is simpler, cleaner, and ultimately the better route to go.
