Removing pages from index
-
Hello,
I run an e-commerce website. I just realized that Google has "pagination" pages in the index which should not be there. In fact, I have no idea how they got there. For example, www.mydomain.com/category-name.asp?page=3434532
There are hundreds of these pages in the index. There are no links to these pages on the website, so I am assuming someone is trying to ruin my rankings by linking to pages that do not exist. The page content displays category information with no products. I realize it's a design flaw, and I am working on fixing it (301-redirecting non-existent pages). Meanwhile, I am not sure if I should request removal of these pages. If so, what is the best way to request bulk removal?
Also, should I 301, 404 or 410 these pages?
Any help would be appreciated.
Thanks,
Alex
-
Yes, the no-content page issue is a big problem. If you have a "view all" option and it covers more than a dozen, fifteen, or maybe twenty products, that should be paginated, with full indexing. Maile Ohye even talked about that specific scenario of "view all" being good.
In my experience, all of the no-content pages should, ideally, be 301 redirected in a way that they point to the most relevant highest level category page on your site.
Since there are so many, there's no easy way to get them removed from the index other than setting up the 301s and then being patient while Google recrawls and re-confirms them.
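As a rough illustration of that 301 approach, here is a minimal sketch of routing out-of-range pagination requests to the category page. It's written in Python rather than classic ASP, and names like `CATEGORY_URLS` and `respond` are hypothetical, not taken from the actual site:

```python
# Illustrative sketch only: the category mapping and page counts are
# hypothetical, not from the real site.
CATEGORY_URLS = {"category-name": "/category-name.asp"}

def respond(category: str, page: int, total_pages: int):
    """Return (status, location) for a paginated category request."""
    if category not in CATEGORY_URLS:
        return 404, None                      # unknown category: plain 404
    if page < 1 or page > total_pages:
        # "No content" page: permanently redirect to the category root
        return 301, CATEGORY_URLS[category]
    return 200, None                          # valid page: serve it normally
```

The point is that the decision is made per request, so the thousands of already-indexed junk URLs all collapse to one 301 target as Google recrawls them.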
-
Ah - that's definitely better, if you don't go too wide. The 2009-2010 concept of not having too many links went too far with too many people; sites became too flat.
Categories and pagination are best served by having enough categories to cover the highest-level groups, with sub-categories as appropriate, but not to the point where there are only a few products in any single sub-category. So if you've got more than a dozen or fifteen products in a category or sub-category, pagination is perfectly valid.
Having more than six, eight, or maybe ten top-level categories is also not good.
-
Alan, I think I misspoke. I meant to say that a categorically structured set of your products would be better to index than a paginated version. For example:
http://www.sunglasses.com/mens/black/productx
as opposed to http://www.sunglasses.com/products?page=233
Is it still considered wise to index both paginated results along with categorized results in this case?
-
Hi Alan,
Thanks for the info. I was going to set my page 2+ to "noindex,follow"; however, your reply makes sense. I will leave them indexable. I do see some competitors "rel=canonical" pagination to "view all" pages. I think I will keep my pages as is.
However, as my reply to Ryan stated, my issue is still the INDEX.
Google has thousands of "no content" pages indexed. They contain links to other "no content" pages, making my site look thin. This may be the reason we lost so much ranking/traffic with the Panda update.
How do I get these pages removed from the index? And do I return 301, 404 or 410 when Google comes back to reindex them?
Thanks for your help!
Alex
-
Hi Ryan,
I crawled the site and did not find links to these pages; however, it made me realize another HUGE issue. Since the paging is dynamically created, it renders "back" and "forward" links no matter what page you are on. So, if page # 5000 is displayed, it will have links to page # 4999 and 5001. Although my website does not link to pages that do not exist, all it takes is someone linking to my site with "page=10000" and Google indexing that page. From that point on, Google will index all the PAGE URLs that do not exist.
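For what it's worth, the rendering-side fix can be sketched like this. This is an illustrative Python function (`pagination_links` is a made-up name, and your real code is ASP), but the idea is to emit back/forward links only when the target page actually exists, so a crawler landing on an out-of-range URL finds no further invalid pages to follow:

```python
def pagination_links(current: int, page_count: int) -> list[int]:
    """Prev/next page numbers, emitted only when the target page exists."""
    links = []
    if 1 <= current - 1 <= page_count:
        links.append(current - 1)  # "back" link
    if 1 <= current + 1 <= page_count:
        links.append(current + 1)  # "forward" link
    return links
```

With this check in place, a request for page 5000 of a 100-page category produces no prev/next links at all, which breaks the chain of phantom pages.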
Thanks again for getting me a step closer to resolving my problem.
However, the problem is still the INDEX. Google has pages with no content indexed (I now realize it's in the thousands). These pages just contain links to other PAGING pages that have no content, plus my main menu/categories.
How do I get these pages removed from the index?
Thanks again!
Alex
-
For the record, the link that SSCDavis referenced includes Matt Cutts discussing faceted navigation, not pagination. Faceted navigation differs from pagination by leaps and bounds. So he (SSCDavis), with all due respect, is absolutely incorrect in his claim about what Matt said.
Maile Ohye, senior support engineer at Google, definitely recommends allowing pagination to be indexed, if implemented properly. She even discussed this at length this week at SMX Advanced in Seattle. Vanessa Fox, head of Nine by Blue and former Googler (the creator of Google Webmaster Tools), agrees.
And so do I.
When performed properly, pagination (with quality optimization of paginated pages) can lead to dramatic increases in individual products indexed, higher quality visits from people further along in the buying process, and more people finding the site through an exponentially greater number of keyword phrases.
Consider this: with pagination (X products on the initial page, X additional DIFFERENT products on page 2, still more and different products on page 3, etc.), by not wanting those pages indexed, you're communicating to Google: "Hey, we don't care about these other products enough to include them." Which means Google gets a false and negative understanding of how many products you have in your catalog. And THAT drives the overall strength of your catalog down.
Now, if, on the other hand, you already show ALL of your products on a top level page that is linked from the main navigation, then sure, pagination should be killed. But only if that's the case.
-
Alex,
I would highly recommend crawling your website and examining the crawl report. If Google is indexing these pages, then it got to them via your site at some point. I would proceed with the mindset that this is a web design issue, not someone trying to ruin your rankings as you suggested.
The crawl report will show the referrer page which can help troubleshoot the issue. When you have pages generated by a CMS or other software, there can easily be issues like the one you are experiencing. In my experience this is the most likely cause of your issue.
You mentioned there are hundreds of these pages in the index. If you can determine a pattern they match, you may be able to 301 all of them with a single rule, sending the user to your main category page or wherever you feel is best.
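As a hedged sketch of that single-rule idea, here is one way the pattern match could work, expressed in Python for illustration (the real rule would live in your IIS/URL-rewrite config; the URL pattern is taken from the example in the question, and `MAX_PAGES` is an assumed lookup of real page counts):

```python
import re

# Matches the example URL shape: /category-name.asp?page=3434532
PAGINATION_RE = re.compile(r"^/(?P<cat>[\w-]+)\.asp\?page=(?P<page>\d+)$")

# Hypothetical per-category real page counts; in practice this would
# come from the product database.
MAX_PAGES = {"category-name": 12}

def redirect_target(url: str):
    """301 target for an out-of-range pagination URL, else None (serve as-is)."""
    m = PAGINATION_RE.match(url)
    if not m:
        return None
    cat, page = m.group("cat"), int(m.group("page"))
    if page > MAX_PAGES.get(cat, 0):
        return "/%s.asp" % cat  # collapse the junk URL to its category page
    return None
```

One rule like this handles every bad URL in the pattern at once, so you never have to enumerate the hundreds of indexed junk pages individually.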
You can also set up parameter-specific instructions in Google WMT. I would avoid doing this until after you have reviewed your crawl report. From your Google WMT dashboard > Site Configuration > Settings > Parameter handling tab > find or add your parameter and adjust the setting as you deem fit.
-
**Edit:** Please see Alan's answer.