Product search URLs with parameters and pagination issues - how should I deal with them?
-
Hello Mozzers - I am looking at a site that generates parameterized URLs (sadly unavoidable in the case of this website, given the resources they have available - none for redevelopment). They currently deal with the URLs that include parameters via robots.txt - e.g. Disallow: /red-wines/?
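For context, the rule in question looks roughly like this (the /red-wines/ path is from the question; the commented broader variant is a hypothetical alternative some sites use):

```text
# Block crawling of any /red-wines/ URL that carries a query string
User-agent: *
Disallow: /red-wines/?

# A broader (hypothetical) variant blocking query strings site-wide:
# Disallow: /*?
```

Worth noting: a disallowed URL can still end up indexed (without its content) if it is linked from elsewhere, which may be part of why the site layered canonical tags on top.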
Beyond that, they use rel=canonical on every paginated parameter page in search results (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2).
I have never used this method on paginated "product results" pages. Surely this is an incorrect use of canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it as a safeguard in case the robots.txt directive isn't followed - as sometimes it isn't - to guard against some of the parameter pages being indexed?
I note that Rand Fishkin has warned against "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL", because "you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)." Yet I see this time and again on ecommerce sites, on paginated results - any idea why?
Now the way I'd deal with this is:
Meta robots tags on the parameter pages I don't want indexed (noindex, nofollow - though since this is not duplicate content, perhaps I should use follow instead?)
Use rel="next" and rel="prev" links on paginated pages - that should be enough.
Look forward to feedback and thanks in advance, Luke
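As a sketch, the setup described above might look like this on a middle page of the series (the domain and parameter name are illustrative, borrowed from the example URL in the question):

```html
<!-- On /red-wines/?pIndex=2, a middle page of the paginated series -->
<head>
  <!-- Keep the page out of the index, but let crawlers follow its links
       so products deeper in the series still get discovered -->
  <meta name="robots" content="noindex, follow">

  <!-- Declare the page's position in the series -->
  <link rel="prev" href="https://example.com/red-wines/?pIndex=1">
  <link rel="next" href="https://example.com/red-wines/?pIndex=3">
</head>
```

One caveat: if the robots.txt disallow stays in place, crawlers never fetch these pages at all, so any meta robots or prev/next tags on them go unseen - the two approaches work against each other.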
-
Hi Zack,
Have you configured your parameters in Search Console? Looks like you've got your prev/next tags nailed down, so there's not much else you need to do. It's evident to search engines that these types of dupes are not spammy in nature, so you're not running a risk of getting dinged.
-
Hi Logan,
I've seen your responses on several threads now on pagination and they are spot on so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues in the past this way but we recently changed our site. We now have the option to view 60 or 180 items at a time on a landing page which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below). Each view has its own rel=next and rel=prev tags. Wondering what we can do to get rid of this issue besides just removing the 180- and 60-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
Thoughts, ideas or suggestions are welcome. Thanks!
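For illustration, the two example URLs above would each sit inside their own prev/next chain, something like this (hypothetical markup based on the URLs in the question):

```html
<!-- On /gifts/for-the-couple?view=all&n=180&p=2 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=1">
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=180&p=3">

<!-- On /gifts/for-the-couple?view=all&n=60&p=4 -->
<link rel="prev" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=3">
<link rel="next" href="https://www.example.com/gifts/for-the-couple?view=all&n=60&p=5">
</link>
```

Each chain is internally consistent, but the two chains paginate the same inventory at different page sizes - which is exactly the overlap being described.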
-
I've been having endless conversations about this over the last few days and in conclusion I agree with everything you say - thanks for your excellent advice. On this particular site next/prev was not set up correctly, so I'm working on that right now.
-
Yes I agree totally - some wise words of caution - thanks.
-
Thanks for the feedback - it is Umbraco.
-
To touch on your question about whether to follow or nofollow those links: if the pages in question could help with crawling in any fashion at all - even if they are useless in their own right, they may still serve other pages in terms of crawl paths and internal PageRank distribution - then I would "follow" them. Only if they are utterly useless to other pages too, and turn up excessively in a crawl of the site, would I "nofollow" them. Ideally, these URLs wouldn't be found at all, as they dilute internal PageRank.
-
Luke,
Here's what I'd recommend doing:
- Lose the canonical tags, that's not the appropriate way to handle pagination
- Remove the disallow in the robots.txt file
- Add rel next/prev tags if you can; since parameterized URLs are not separate pages, some CMSs make it awkward to add tags to only certain parameter versions of a page
- Configure those parameters in Search Console (the last item under the Crawl menu) - you can specify each parameter on the site and its purpose. You may find that some of these have already been detected by Google; you can go in and edit those. You should configure your filtering parameters as well.
- Don't noindex these pages, for the same reason you might not be able to add rel next/prev: you risk the noindex tag applying to the root version of the URL rather than just the parameter version.
Google has gotten really good at identifying types of duplicate content due to things like paginated parameters, so they don't generally ding you for this kind of dupe.
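To illustrate the risk in that noindex point with a hypothetical template snippet: if the CMS renders one template for /red-wines/ and all of its parameter variants, a meta robots tag added to that template applies everywhere the template is served:

```html
<!-- One template serves /red-wines/, /red-wines/?pIndex=2,
     /red-wines/?region=rhone, and every other parameter variant -->
<head>
  <!-- Intended only for the parameter pages, but because the root URL
       renders the same template, this would drop /red-wines/ itself
       out of the index too -->
  <meta name="robots" content="noindex, follow">
</head>
```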