Googlebot indexing URLs with ? queries in them. Is this Panda duplicate content?
-
I feel like I'm being damaged by Panda because of duplicate content, as I have seen Googlebot on my site indexing hundreds of URLs with ?fsdgsgs strings after the .html. They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later. I'm at a loss what to do. Since Panda, I have lost a couple of dozen #1 rankings that I'd held for months on end, and had one drop over 100 positions.
-
Thanks for all that. Really valuable information. I have gone to parameter handling and there were 54 parameters listed, in total generating over 20 million unnecessary URLs. I nearly died when I saw it. We have 6,000 genuine pages and 20 million shitty ones that don't need to be indexed. Thankfully, I'm upgrading next week and I have turned the feature off on the current site; the new one won't have that feature. Phew.
I have changed the settings for these parameters that were already listed in Webmaster tools, and now I wait for the biggest re-index in history LOL!
I have submitted a sitemap now and as I rewrite page titles & meta descriptions, I'm using the Fetch as Google tool to ask for resubmission. It's been a really valuable lesson, and I'm just thankful that I wasn't hit worse than I was. Now, it's a waiting game.
Of the 6,000 URLs on the sitemap submitted a couple of days ago, around 1/3 have been indexed. When I first uploaded it, only 126 of them were.
-
The guys here are all correct - you can handle these in WMT with parameter handling, but as every piece of text about parameter handling states, handle with care. You can end up messing things up big-time if you block areas of the site you do want crawled.
You'll also have to wait days, or longer, for Google to acknowledge the changes and reflect them in its index and in WMT.
If it's an option, look at using the canonical tag to self-reference: this means that if the CMS serves the same page on multiple URLs, they'll all point back to the original URL.
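For reference, a self-referencing canonical is just a link element in the head of each page; the parameterised variants then all declare the clean URL as the original. A minimal sketch, using an invented example URL rather than anything from this thread:

```html
<!-- Served identically on both /widgets.html and /widgets.html?fsdgsgs=1
     (example URLs): the head carries one canonical pointing at the clean
     version, so the parameterised copies consolidate to it. -->
<head>
  <link rel="canonical" href="http://www.example.com/widgets.html" />
</head>
```

The CMS or store platform usually has a setting or plugin to emit this automatically on every page.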
-
"They were being generated by an add-on filtering module on my store, which I have since turned off. Googlebot is still indexing them hours later."
Google will continue to index them until you tell them specifically not to. Go to GWT and resubmit a sitemap containing only the URLs you want indexed. Additionally, do a "Fetch as Google" on the same pages as your sitemap. This can help speed up the reindexing process.
Also, hours? LMAO, it will take longer than that. Unless you are a huge site that gets crawled hourly, it can take days, if not weeks, for those URLs to disappear. I'm thinking longer, since it doesn't sound like you have redirected those links, just turned off the plugin that created them. Depending on how your store is set up, and how many pages you have, it may be wise to 301 all the offending pages to their proper destination URLs.
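On an Apache store, one way to 301 every parameterised .html URL back to its clean equivalent is a rewrite rule that strips the query string. A hedged sketch only, not tested against the poster's setup, and it assumes no legitimate query parameters need to survive:

```apache
RewriteEngine On
# Only act when a query string is actually present
RewriteCond %{QUERY_STRING} .
# Redirect /anything.html?whatever to /anything.html
# (the trailing "?" in the target drops the query string)
RewriteRule ^(.*\.html)$ /$1? [R=301,L]
```

If some parameters are genuinely needed (pagination, tracking), the condition would have to be narrowed to match only the filter-module parameters.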
-
Check out parameter exclusion options in Webmaster Tools. You can tell the search engines to ignore these appended parameters.
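If blocking crawling outright is acceptable (as opposed to cleaning up URLs already in the index), a robots.txt wildcard can also keep spiders out of parameterised URLs. A sketch; Google honours the * wildcard here, but note that URLs blocked this way can still linger in the index:

```
User-agent: *
# Block crawling of any URL containing a query string
Disallow: /*?
```

Parameter handling in Webmaster Tools is the more surgical option, since it lets you keep some parameters crawlable.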
-
Use a spidering tool to check out all of the links from your site, such as Screaming Frog.
Also check that your XML and HTML sitemaps don't contain old links.
Hope this helps
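If a desktop crawler like Screaming Frog isn't to hand, even a few lines of standard-library Python can flag on-page links that carry query strings. A rough sketch (the sample HTML below is invented for illustration):

```python
# Minimal stand-in for a link crawler: collect hrefs that carry a
# query string, i.e. the "?fsdgsgs"-style URLs described above.
from html.parser import HTMLParser
from urllib.parse import urlparse

class QueryLinkFinder(HTMLParser):
    """Collects href values that contain a non-empty query string."""
    def __init__(self):
        super().__init__()
        self.query_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        if urlparse(href).query:  # non-empty "?..." part
            self.query_links.append(href)

def find_query_links(html):
    finder = QueryLinkFinder()
    finder.feed(html)
    return finder.query_links

sample = '<a href="/page.html?fsdgsgs=1">a</a> <a href="/clean.html">b</a>'
print(find_query_links(sample))  # -> ['/page.html?fsdgsgs=1']
```

To check a whole site you would feed it each fetched page in turn; a real crawler also handles relative URL resolution and politeness, which this sketch skips.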
Related Questions
-
Duplicate URL errors when URLs are unique
Hi All, I'm running through the Moz Analytics site crawl report and it is showing numerous duplicate URL errors, but the URLs appear to be unique. I see that the majority of each URL is the same, but shouldn't the different brands make them unique to one another? http://www.sierratradingpost.com/clearance~1/clothing~d~5/tech-couture~b~33328/ http://www.sierratradingpost.com/clearance~1/clothing~d~5/zobha~b~3072/ Any ideas as to why these would be shown as duplicate URL errors?
On-Page Optimization | STP_SEO
-
Ecommerce product page duplicate content
Hi, I know this topic has been covered in the past but I haven't been able to find the answers to this specific thing. So let's say on a website, all the product pages contain partial duplicate content - i.e. this could be delivery options or returning policy etc. Would this be classed as duplicate content? Or is this something that you would not get concerned about if it's let's say 5-10% of the content on the page? Or if you think this is something you'd take into consideration, how would you fix it? Thank you!
On-Page Optimization | MH-UK
-
PDF Instructions come up in Crawl report as Duplicate Content
Hello, My ecommerce site has many PDF instruction pages that are being marked as duplicate content in the site crawl. Each page has a different title, and then a PDF displayed in an iframe with a link back to the previous page and to the category that the product is placed in. Should I add text to the pages to help differentiate them? I included a screenshot of the code that is on all the pages. Thanks! Justin
On-Page Optimization | JustinBSLW
-
Duplicate Content aka 301 redirect from .com to .com/index.html
Moz reports are telling me that I have duplicate content on the home page because .com and .com/index.html are being seen as two pages. I have implemented a 301 redirect using various codes I found online, but nothing seems to work. Currently I'm using this code:

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^jacksonvilleacservice.com
RewriteRule ^index.html$ http://www.jacksonvilleacservice.com/ [L,R=301]

Nothing is changing. What am I doing wrong? I have given it several weeks but the report stays the same. Also, according to Webmaster Tools, they can't see this as duplicate content. What am I doing wrong?
On-Page Optimization | omakad
-
When You Add a Robots.txt file to a website to block certain URLs, do they disappear from Google's index?
I have seen several websites recently that have have far too many webpages indexed by Google, because for each blog post they publish, Google might index the following: www.mywebsite.com/blog/title-of-post www.mywebsite.com/blog/tag/tag1 www.mywebsite.com/blog/tag/tag2 www.mywebsite.com/blog/category/categoryA etc My question is: if you add a robots.txt file that tells Google NOT to index pages in the "tag" and "category" folder, does that mean that the previously indexed pages will eventually disappear from Google's index? Or does it just mean that newly created pages won't get added to the index? Or does it mean nothing at all? thanks for any insight!
On-Page Optimization | williammarlow
-
Should I worry about duplicate titles on pages where there is paginated content?
LivingThere.com is a real estate search site and many of our content pages are "search result" - ish in that a page often provides all the listings that are available and this may go on for multiple pages. For example, this is a primary page about a building: http://livingthere.com/building/31308-Cocoa-Exchange Because of the number of listings, the listings paginate to a second page: http://livingthere.com/building/31308-Cocoa-Exchange?MListings_page=2 Both pages have the same Page Title. Is this a concern? If so is there a "best practice" for giving paginated content different titles? Thanks! Nate
On-Page Optimization | nate123
-
What is the best duplicate content checker that will check by phrase?
I create a lot of landing pages for individual keywords on my site. An issue that I've run into is I unknowingly use some common phrases repeatedly on different pages and therefore sometimes get dinged by Google. I'm basically looking for a tool that would check the content of a new page against all the other pages on my site and check it phrase by phrase. Most of the tools I've found make you put in two URLs to check against - I need it to check against hundreds.
On-Page Optimization | davegr
-
Is it good to have dashes in URLs?
When using keywords in URLs for internal pages, isn't it a good idea to use dashes or underscores in the URL between the keywords?
On-Page Optimization | BradBorst