Crawl Diagnostics Updates
-
I have several page types on my sites that I've blocked using the robots.txt file (ex: emailafriend.asp, shoppingcart.asp, login.asp), but they are still showing up in crawl diagnostics as issues (ex: duplicate page content, duplicate title tags, etc). Is there a way to filter these issues out, or is there something I'm doing wrong that's causing them to show up?
- Ryan
-
Hi Ryan,
Try moving the sitemap directive to the end and leaving a blank line before it, something like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /SearchResults.asp...
...
Disallow: /mailinglist_subscribe.asp
Disallow: /mailinglist_unsubscribe.asp
Disallow: /EmailaFriend.asp
-
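Putting that advice together, the finished file would be shaped like this (a sketch using only the rules quoted above; the sitemap URL is an assumption, and the elided Disallow lines from the "..." are left out):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /SearchResults.asp
Disallow: /mailinglist_subscribe.asp
Disallow: /mailinglist_unsubscribe.asp
Disallow: /EmailaFriend.asp

Sitemap: http://www.naturalrugco.com/sitemap.xml
```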
I added the pages it was flagging to the robots.txt file:
http://www.naturalrugco.com/robots.txt
Most of the pages listed as high priority errors in the Moz Analytics crawl diagnostics are the EmailaFriend.asp pages, which I've disallowed. Ex: http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent
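One quick way to confirm the rules actually match the flagged URLs is Python's standard-library robots.txt parser. This is a minimal sketch that feeds it the rules quoted above rather than the live file (the second test URL is hypothetical, just for contrast):

```python
from urllib.robotparser import RobotFileParser

# Parse the rules from the thread locally instead of fetching the
# live robots.txt, so we test exactly the lines quoted above.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /ShoppingCart.asp
Disallow: /SearchResults.asp
Disallow: /EmailaFriend.asp
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

blocked = "http://www.naturalrugco.com/EmailaFriend.asp?ProductCode=AMB0012-parent"

# A compliant crawler should skip the disallowed URL...
print(parser.can_fetch("rogerbot", blocked))  # False
# ...but still fetch ordinary pages (hypothetical URL for illustration).
print(parser.can_fetch("rogerbot", "http://www.naturalrugco.com/default.asp"))  # True
```

Note that a `Disallow` rule stops crawling, not indexing, so pages crawled before the rule was added can linger in reports until the next crawl cycle.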
-
Hi Ryan,
At the end of this page you will find several ways to block Rogerbot from crawling pages: http://moz.com/help/pro/rogerbot-crawler
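Among the options that article covers are meta robots tags. A minimal sketch of what that might look like in the `<head>` of a page (the rogerbot-specific variant is my assumption from the article; check the linked page for the exact directives it recommends):

```
<!-- Keep the page out of all compliant crawlers' indexes -->
<meta name="robots" content="noindex, nofollow">

<!-- Or target Moz's crawler specifically (assumed from the linked article) -->
<meta name="rogerbot" content="noindex">
```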
I hope it helps,
Istvan
Related Questions
-
Planning to update my Volusion site to HTTPS protocol. Concerns.
I'm planning to update my Volusion Commerce site to the https protocol. A config variable switch will convert URLs to secure https URLs. I'm looking into what additional steps I must take and what issues I may run into. I guess any relative links within my site will not be affected, but what about incoming backlinks? How do I address those? I'm also concerned about previous redirects. Do I have to create a new xml file incorporating the new protocol in the target, with versions of both secure and insecure source URLs? I figure a new sitemap has to be submitted to Google Search Console - should there be a secure https://www.example version as well as a https://example version of the site, and the older similar http versions submitted? Thanks Howard
On-Page Optimization | | mrkingsley0
-
Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (Here I mean changing the publish date from the original publish date to today's date - not publishing on other sites). I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors. It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble. Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do? Thanks Moz community!
On-Page Optimization | | paulz9990
-
Should we rename and update a page or create a new page entirely?
Hi Moz Peoples! We have a small site with simple site navigation, with only a few links on the nav bar. We have been doing some work to create a new page, which will eventually replace one of the links on the nav bar. The question we have is: is it better to rename the existing page, replace its content, and then wait for the great indexer to do its thing, or to permanently delete the page and replace it with the new page and content? Or is this a case where it really makes no difference as long as the redirects are set up correctly?
On-Page Optimization | | Parker8180
-
How often is your domain authority updated?
I can't seem to figure out how often our domain authority is updated - it seems random. Do you know when this typically happens? Thanks!
On-Page Optimization | | regineraab0
-
Sitemaps Updating
I'm using WordPress and I've realized that my sitemap doesn't update itself when I add a new page to my website, like a blog post. I have to (1) go to Settings > XML sitemap settings > click Build Sitemap > save changes in WordPress, and then (2) export the sitemap.xml file to Google Webmaster Tools every single time I blog. Am I doing it wrong? I feel that all of this should be automatic.
On-Page Optimization | | kevinbp0
-
How to fix Medium Priority Issues in the Moz Pro crawl report?
How do I resolve these issues crawled by Moz Pro? Some medium priority issues, for example: Missing Meta Description Tag: 2669; Title Element is Too Long: 523; Duplicate Page Title: 37. How do I add the missing meta description tag to these pages, and how do I shorten the title elements?
On-Page Optimization | | renukishor
-
How updating a post can influence SEO
I have just read this great post from the Moz blog: http://moz.com/blog/google-fresh-factor But I haven't found anything about updating the post date. If I edit an old great post (now ranked 2nd after several months in 1st SERP position), is it also better to update the date of the post in the WordPress post edit page? Thank you very much, and sorry for my poor English. Bye, Dario.
On-Page Optimization | | Italianseolover0
-
Dealing with updating blog posts
I run a travel and culture blog, which means I write about a lot of upcoming events which recur each year. Usually I title (and slug) the page with the event name and date. When it comes to updating the article the next year, sometimes it's as little as changing the date; other times more has changed and it needs to be substantially re-written. Until now, what I've done is update the title and content, and then re-post (sometimes altering the slug where needed). Sometimes it works fine and Google keeps me ranking well, but other times the changes don't get such a great response. As far as I can see, I have these options. Which do you think is best?
1. Create a new article each year and put a message at the start of the previous one saying "click here to read about the 2012 event".
2. Continue what I'm doing: updating, changing the slug, and re-posting (i.e. changing the date).
3. Write a new article and insert a 301 redirect.
I need to make sure the article appears as a new article in my RSS feed and also on the homepage. Look forward to your ideas! Thanks
On-Page Optimization | | ben10000