Help with ROBOTS.TXT

Posted: Thu Feb 28, 2013 8:05 am
by tobul
Hi, I want to block some links that Google Webmaster Tools marks as having duplicate titles.

I want to block all the search links, for example:
/product/search&filter_tag=&limit=100&page=20

I also want to block the sort links. Thanks.

Re: Help with ROBOTS.TXT

Posted: Thu Feb 28, 2013 8:42 pm
by ChetanCx

Code:

User-agent: *
Disallow: /*&limit
Disallow: /*&sort
Disallow: /*?route=checkout/
Disallow: /*?route=account/
Disallow: /*?route=product/search
Disallow: /*?route=affiliate/
Allow: /

Re: Help with ROBOTS.TXT

Posted: Thu Feb 28, 2013 9:52 pm
by tobul
But I use SEO URLs, so my links don't contain "route=".

Re: Help with ROBOTS.TXT

Posted: Thu Feb 28, 2013 10:15 pm
by ChetanCx

Code:

User-agent: *
Disallow: /*&limit
Disallow: /*&sort
Disallow: /*checkout/
Disallow: /*account/
Disallow: /*product/search
Disallow: /*affiliate/
Allow: /
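
If you want to sanity-check rules like these before uploading, here is a minimal sketch (not from the original replies) that mimics how Googlebot interprets the "*" wildcard by turning each Disallow pattern into a regular expression and testing it against a few sample URLs. The sample URLs are hypothetical and only for illustration.

Code:

import re

# Disallow patterns from the SEO-URL version of robots.txt above
disallow_patterns = [
    "/*&limit",
    "/*&sort",
    "/*checkout/",
    "/*account/",
    "/*product/search",
    "/*affiliate/",
]

def pattern_to_regex(pattern):
    # Googlebot treats '*' as "any sequence of characters" and '$' as end of URL;
    # everything else is literal, and rules are anchored at the start of the path.
    escaped = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.compile("^" + escaped)

regexes = [pattern_to_regex(p) for p in disallow_patterns]

def is_blocked(path):
    return any(rx.search(path) for rx in regexes)

# Hypothetical sample URLs for illustration
samples = [
    "/product/search&filter_tag=&limit=100&page=20",
    "/some-category?sort=p.price&order=ASC",  # '&sort' only matches when sort is not the first parameter
    "/my-seo-product",
]

for url in samples:
    print(url, "->", "blocked" if is_blocked(url) else "allowed")

One caveat the sketch makes visible: "Disallow: /*&sort" only catches URLs where sort appears after another parameter. If sort can also be the first parameter (e.g. "?sort="), you would likely want an extra rule such as "Disallow: /*?sort" as well.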