De-indexing pagination
-
I have a custom-made blog with boatloads of undesirable URLs in Google's index, like these:
.com/resources?start=150
.com/resources?start=160
.com/resources?start=170
I've identified this as a source of duplicate title tags and had my programmer add a noindex meta tag automatically to all of these undesirable URLs.
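For illustration, template logic that emits a noindex tag only on the paginated ?start= URLs could be sketched like this (hypothetical code; the blog's actual implementation isn't shown):

```python
# Hypothetical sketch: emit a robots meta tag only on paginated
# "?start=" URLs so they eventually drop out of the index.
from urllib.parse import parse_qs, urlparse

def robots_meta_for(url):
    """Return a noindex robots meta tag for paginated URLs, else ''."""
    params = parse_qs(urlparse(url).query)
    if "start" in params:
        return '<meta name="robots" content="noindex">'
    return ""

print(robots_meta_for("https://example.com/resources?start=150"))
# -> <meta name="robots" content="noindex">
```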
However, doing a site: search in Google shows the URLs are still indexed, even though I put the tag up a few weeks ago.
How do I get Google to remove these URLs from the index? I'm aware that Search Console has an answer here: https://support.google.com/webmasters/topic/4598466?authuser=1&authuser=1&rd=1 but it says that blocking with meta tags should work.
Do I just get Google to crawl the URLs again so it sees the tag and then deindexes them? Or is there another way I'm missing?
-
Adding a meta noindex tag can mean it takes a few weeks for a page to fall out of the index. These pages probably aren't doing you much harm, so if you want to just wait for them to drop out, that's probably fine (although I would update the tag content to "noindex, follow" to help Google crawl through to the other noindexed pages). If you really want them out of the index faster, you have two options: the "Remove URLs" function under Google Index in Google Search Console will temporarily remove them from the index while Google registers the noindex tags, or the Fetch + Render tool followed by Submit URLs in Google Search Console will prompt Google to come back, crawl your pages, and find the noindex tag.
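Before asking Google to re-fetch the pages, it's worth confirming the noindex directive is actually being served. A minimal check using only Python's standard library (a sketch; the sample page content here is a stand-in) might look like:

```python
# Sketch: verify that fetched HTML actually contains a robots
# noindex directive before requesting a recrawl.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Collect individual directives, e.g. "noindex", "follow".
            self.directives += [d.strip().lower()
                                for d in attrs.get("content", "").split(",")]

def is_noindexed(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # -> True
```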
-
You could also use the URL parameter settings in Google Search Console and Bing Webmaster Tools, provided all ?start= URLs can be treated the same way by Google.
Related Questions
-
Massive local + national disconnect in rankings (local deindexed)
I asked the question originally on Webmaster Central. I tried RickRoll's solutions, but they don't seem to have solved the issue. Problem below:

I've been noticing for some time that certain pages of our site (https://www.renthop.com/boston-ma/apartments-for-rent) have been deindexed locally (or rank very low), but are indexed nationally (and rank well). In fact, it seems that the actual page isn't ranking (but the blog https://www.renthop.com/blog is). This huge mismatch between national and local rankings seems to only happen for Boston & Chicago. Other parts of the country seem unaffected (and the national & local rankings are very similar).

A bit of background (and my personal theory as to what's happening): we used to have the subdomains boston.renthop.com & chicago.renthop.com for the site. These subdomains stopped working, though, as we moved the site to the directory format (https://www.renthop.com/boston-ma/apartments-for-rent). The subdomain URLs were inactive / broken for roughly 4 months. After the 4 months, we did a 301 from the subdomains to the main pages (because these subdomains had inbound external links). However, this seems to have caused the directory pages to exhibit the national/local mismatch effect instead of helping.

Is there anything I'm doing wrong? I'm not sure if the mismatch is natural, if the pages are getting algorithmically penalized on a local level (I'm negative-SEOing myself), or if everything is stuck in some weird state because of the bad sub-domain move. Some things I've tried: I've created Webmaster Console (verified) accounts for both subdomains. I've asked Google to crawl those links. I've done a 1-to-1 mapping between each individual page on the old site and the new directory format. I've tried 301, 302 and meta-refresh redirects from the subdomains to the directory pages. I've made sure the robots.txt on the subdomains is working properly, and that the robots.txt on the directory pages is working properly.
See below for a screenshot of the mismatch & deindexing in local search results (this is using SERPS, but it can be replicated with any location changer). Note the difference between the ranking (and the page) when the search is done nationally vs. in the actual location (Boston, MA). I'd really appreciate any help; I've been tearing my hair out trying to figure this out (as well as experimenting).
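As a side note, the 1-to-1 subdomain-to-directory mapping described above can be expressed as a small redirect helper. This is a hypothetical sketch: the city-to-path table (including the /chicago-il/ path) is assumed for illustration, not taken from the real site:

```python
# Hypothetical sketch of the subdomain -> directory-format mapping,
# suitable for driving server-side 301 redirects.
from urllib.parse import urlparse

CITY_PATHS = {
    "boston": "/boston-ma/apartments-for-rent",
    "chicago": "/chicago-il/apartments-for-rent",  # path assumed
}

def redirect_target(old_url):
    """Map e.g. boston.renthop.com/* to its new directory URL, or None."""
    host = urlparse(old_url).netloc
    city = host.split(".")[0]
    path = CITY_PATHS.get(city)
    return "https://www.renthop.com" + path if path else None

print(redirect_target("http://boston.renthop.com/listings"))
# -> https://www.renthop.com/boston-ma/apartments-for-rent
```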
Homepage Deindexed?
Hi guys, One site we have been working on, https://tinyurl.com/j5m8vld, seems to have had its homepage deindexed by Google Australia. Our initial take is that it could be a high velocity of links gained too soon, but we're not 100% sure, as there are no messages in GWT. It could also be a duplicate content issue, as we found a duplicate of the homepage, e.g. https://tinyurl.com/zzds5mf, which is exactly the same as the homepage. However, we don't think this would cause the homepage to be deindexed, as it had strong rankings before it was deindexed (ranking for 100 keywords). Any suggestions would be very much appreciated! Cheers.
Javascript search results & Pagination for SEO
Hi. On this page, http://www.key.co.uk/en/key/workbenches, we have JavaScript on the paginated pages to sort the results, so the URL displayed and the URL linked to are different. For example, the paginated pages link to page 2 as http://www.key.co.uk/en/key/workbenches#productBeginIndex:30&orderBy:5&pageView:list& and the list is then sorted by JavaScript. The arrows on either side of the pagination link to e.g. http://www.key.co.uk/en/key/workbenches?page=3, which is where the rel=prev/next details are; this was done for SEO. But when clicking on this arrow, the URL loaded is different again: http://www.key.co.uk/en/key/workbenches#productBeginIndex:60&orderBy:5&pageView:list& I did not set this up, but I am concerned that the URL http://www.key.co.uk/en/key/workbenches?page=3 never actually loads, even though it's linked to so that Google can crawl it. Is this a problem? I am looking to implement a view-all option. Thank you.
Pagination vs. Scroll for Ecommerce
Hi, I wanted to see what opinions were on having product listings on paginated pages vs. loading them as the user scrolls. We use pagination, but I have heard scrolling may be better for SEO? Thanks!
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal webpages (pointing to the https versions). However, many of their non-secure webpages are still being indexed by Google. We have access to their GWMT for both the secure and non-secure pages.
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure that's going to happen. We thought about requesting in GWMT for Google to remove the non-secure pages, but felt this was pretty drastic. Any recommendations would be much appreciated.
Pagination causing duplicate content problems
Hi. The pagination on our website www.offonhols.com is causing duplicate content problems. Is the best solution adding rel="prev" / rel="next" to the hrefs? As of now, the pagination links at the bottom of the page are just:
http://offonhols.com/default.aspx?dp=1
http://offonhols.com/default.aspx?dp=2
http://offonhols.com/default.aspx?dp=3
etc.
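If you do go the rel="prev" / rel="next" route, the tags for these ?dp= pages could be generated along these lines (a sketch; the dp parameter name and page count are assumed from the URLs above):

```python
# Sketch: build rel="prev"/"next" link tags for a paginated listing.
def pagination_links(base, page, last_page, param="dp"):
    """Return the link tags that belong in the <head> of the given page."""
    links = []
    if page > 1:  # no rel="prev" on the first page
        links.append(f'<link rel="prev" href="{base}?{param}={page - 1}" />')
    if page < last_page:  # no rel="next" on the last page
        links.append(f'<link rel="next" href="{base}?{param}={page + 1}" />')
    return links

for tag in pagination_links("http://offonhols.com/default.aspx", 2, 3):
    print(tag)
# -> <link rel="prev" href="http://offonhols.com/default.aspx?dp=1" />
# -> <link rel="next" href="http://offonhols.com/default.aspx?dp=3" />
```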
Pagination, Canonical, Prev & Next
Hello All
I have a question about my Magento setup. I have lots of categories with many products, so the categories paginate. I've seen info about making sure the canonical tag doesn't simply send search engines back to the first page, meaning the paginated pages won't get indexed. I've also seen info about using rel=next & rel=prev to help search engines understand that the category pages are paginated... Is it okay to use both? I've made sure that category/?p=1 has a canonical of category/ so there isn't duplicate content. Here's an example of the category/?p=2 meta data:
<link rel="canonical" href="http://website.com/category/?p=2" />
<link rel="prev" href="http://website.com/category/" />
<link rel="next" href="http://website.com/category/?p=3" />
Best Practices for Pagination on E-commerce Site
One of my e-commerce clients has a script enabled on their category pages that allows more products to automatically be displayed as you scroll down. They use this instead of page 1, 2, and a view all. I'm trying to decide if I want to insist that they change back to the traditional method of multiple pages with a view all button, and then implement rel="next", rel="prev", etc. I think the current auto method is disorienting for the user, but I can't figure out if it's the same for the spiders. Does anyone have any experience with this, or thoughts? Thanks!