It is well known that Google favors fresh, original content. Duplicate content is not an option when fighting for the top positions in Google's SERP (Search Engine Results Page).
Because BigCommerce is a dynamic platform, you may encounter pages with various parameters that can be seen as duplicate content in Google's index.
This happens because, when generating a dynamic page, the store retrieves the different parts of that page from the database and assembles what is effectively a new page out of content that already exists on the site. The pages may look identical, but to a crawler they are duplicates, and duplicates are what you need to avoid.
Luckily, we have two very useful tools to defend ourselves from being dropped in Google's results: the robots.txt file and parameter declarations in Google Webmaster Tools.
Fortunately, BigCommerce stores ship with a base robots.txt file that saves us a lot of trouble, but it may still need some tweaking. You can fine-tune the way crawlers (bots) index your site by altering this file, but here I will focus on the use of parameters in your Webmaster Tools account.
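As an illustration of the robots.txt route, here is a minimal sketch of rules that block crawling of URLs carrying a given parameter (setCurrencyId is used as the example parameter here; adjust the patterns to whatever parameters your own store produces). Note that Googlebot honors the * wildcard, but not every crawler does:

```
User-agent: *
# Block any URL whose query string starts with, or contains, setCurrencyId
Disallow: /*?setCurrencyId=
Disallow: /*&setCurrencyId=
```

Be careful with broad wildcard rules: a pattern like `Disallow: /*?` would block every parameterized URL, including ones you may want indexed.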
If you have links like the ones below, your store may be seen as containing duplicate content, and you may receive warnings for them:
The logical step here is to block the bots from crawling those links, thus preventing them from showing up in Google's index.
First, you need to identify what a parameter in your URL is. Parameters are the name/value pairs found in the URL after the ? sign, with additional parameters separated by & signs.
In the above links the parameters are:
Another example: http://site.com/example-post/?like=1&_wpnonce=72af2cd803
Here the parameters are:
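The same split can be done programmatically, which is handy when auditing many URLs at once. A minimal sketch using only Python's standard library, run against the example URL from this post:

```python
from urllib.parse import urlparse, parse_qs

url = "http://site.com/example-post/?like=1&_wpnonce=72af2cd803"

# Everything after "?" is the query string; "&" separates the parameters.
query = urlparse(url).query
params = parse_qs(query)

for name, values in params.items():
    print(name, "=", values[0])
# like = 1
# _wpnonce = 72af2cd803
```

Each key printed here (like, _wpnonce) is a parameter you could declare in Webmaster Tools.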
So, how do you block the URLs like these from being indexed?
First, log into your Google Webmaster Tools account, locate the Site configuration tab, then choose URL Parameters.
From there, click Add Parameter, insert the parameter found in the duplicate links, select "Yes: Changes, reorders, or narrows page content", select "Other" from the next drop-down box and, most importantly, choose the option "Let Googlebot decide". Use this option if you are unsure how the parameter affects the content of the pages found at those URLs.
If you set up the setCurrencyId parameter in Webmaster Tools, Google will not crawl those URLs.
For a detailed explanation of all the terms found in the URL Parameters section, please read the post from Search Engine Land about parameter usage.