BigCommerce SEO mysteries: is the temp URL a wise decision?

Manuel C. Published on 24 May, 2012

At account creation, each BigCommerce client receives an email with a URL like this: store-XXXXX.mybigcommerce.com. That is the temp URL. Is it an SEO issue or not?

Once you have that temp URL, you know your site can be reached through more than one address. The basic question then becomes: since I already have products and submitted my sitemap to Google, won't the temp URL show up as a website with duplicate content?

Or worse, will my actual store, found at either [yoursubdomain].mybigcommerce.com or [yourstore].com, be seen as containing duplicate content FROM the temp URL? Questions like these can be scary, but the answer is: NO.

No, Google won't see one as a copy of the other. The only thing you need to do is block crawlers until you finish setting up your store, then put the default lines back from Tools - Edit Robots.txt.

Multiple versions of your store are helpful

At least in BigCommerce's setup, that is. There are two main reasons why the temp URL is a well-thought-out decision:

  • it is a good fallback: if you run a Move to Domain action and something goes wrong along the way, you can still access your store through the temp URL

  • you get to use BigCommerce's shared SSL certificate for the secure pages - every ecommerce store needs secure pages for login, account management, checkout and payment details, and all of these need a valid SSL certificate. By using the temp URL for the secure pages, you get the security you need at no extra cost.

Having an ecommerce store isn't as easy as you might think. Technically, there are many modules that need to be developed, and security is one of them.

The only way you can end up NOT being able to use the temp URL is by installing an invalid or expired SSL certificate. Fixing that requires the help of the tech support team, since no one will be able to get into the store's admin panel.

Still, where is the explanation for the duplicate content issue with stores found at the temp URL?

Why is robots.txt so important?

Yes, one simple file can do amazing things. The robots.txt file tells search engines which parts of your site they may crawl, and the well-behaved ones obey it. Some "rogue" crawlers might crawl your site even if they are not allowed to, but that doesn't happen with Yahoo, Bing or Google.

Basically, robots.txt ensures that only the PUBLIC part of your website is crawled and all the other parts are disallowed. Here is how you can test this (a small script after the list can automate the check):

  • http://store-XXXXX.mybigcommerce.com/robots.txt - all is disallowed

  • httpS://store-XXXXX.mybigcommerce.com/robots.txt - all is disallowed

  • httpS://yourstore.com/robots.txt - secure part of your store, all disallowed

  • http://yourstore.com/robots.txt - usually the default file which allows your site to be crawled
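
If you'd rather check these from a script than in a browser, here is a minimal Python sketch using the standard library's urllib.robotparser; the store-XXXXX and yourstore.com hosts are the placeholders from the list above, so swap in your real addresses:

  import urllib.robotparser

  # Placeholder hosts from the list above - replace them with your real store.
  robots_urls = [
      "http://store-XXXXX.mybigcommerce.com/robots.txt",
      "https://store-XXXXX.mybigcommerce.com/robots.txt",
      "https://yourstore.com/robots.txt",
      "http://yourstore.com/robots.txt",
  ]

  for robots_url in robots_urls:
      parser = urllib.robotparser.RobotFileParser()
      parser.set_url(robots_url)
      parser.read()  # fetch and parse the live robots.txt
      # Ask whether a generic crawler may fetch the site root.
      root = robots_url.replace("/robots.txt", "/")
      verdict = "allowed" if parser.can_fetch("*", root) else "disallowed"
      print(robots_url, "->", verdict)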

How does one disallow all crawlers in robots.txt? Below is the code:
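
  User-agent: *
  Disallow: /

The asterisk matches every crawler and the lone slash covers the whole site, so every compliant search engine stays out.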

It is good practice to block everything while you set up your store, but do not forget to put the default code back afterwards.
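
The exact default varies from store to store, so treat the snippet below as an illustrative sketch (the listed paths are assumptions) and copy the actual lines from Tools - Edit Robots.txt. The idea is that the public catalog stays crawlable while account and checkout pages are kept out:

  # Illustrative sketch only - the exact default paths may differ,
  # so restore the real lines from Tools - Edit Robots.txt.
  User-agent: *
  Disallow: /account.php
  Disallow: /cart.php
  Disallow: /checkout.php
  Disallow: /login.php
  Disallow: /search.php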

There you have it. The temp URL does not generate duplicate content issues because Google is not allowed to crawl it. Happy BigCommercing.
