Sitemap blocking or not blocking, that is the question?
-
Hi from wet & overcast Wetherby, UK.
My question is this...
"Are the sitemap's plus boxes blocking bots, i.e. can crawlers not get past this page: http://www.langleys.com/Site-Map.aspx?"
It's just the + boxes that concern me; I remember reading somewhere that JavaScript navigation can be toxic to crawlability.
Is there a way to test JavaScript nav setups and see whether they block bots or not?
Thanks in advance
-
I use Screaming Frog SEO Spider (free version) to check the internal link structure of a website. If a page is blocking spiders, it will pick that up.
Another thing I would suggest is checking Google Webmaster Tools to see if there are any crawl errors.
And the last thing I would add: make sure you have a non-JavaScript way to find every page on your website - through strong internal linking or a manual sitemap page that isn't generated through JS.
Hope this helps
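If you want to test a JS nav setup yourself without a crawler tool, you can check which links exist in the raw HTML, since a bot that doesn't execute JavaScript only sees anchors present in the static source. A minimal sketch (the URL is the one from the question; the user agent string and everything else is illustrative, not a real crawler's):

```python
# Sketch: list the links a non-JS crawler would see on a page.
# If the sitemap's URLs show up here, the +/- boxes are cosmetic
# only and bots can still follow the links.
from html.parser import HTMLParser
from urllib.request import Request, urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags in static HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def static_links(url):
    """Fetch a page without rendering JS and return its anchor hrefs."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0 (crawl test)"})
    html = urlopen(req).read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


# print(static_links("http://www.langleys.com/Site-Map.aspx"))
```

If the links only appear after clicking the + boxes but are still present in the HTML (just hidden with CSS/JS), crawlers can follow them; if they are injected by JavaScript on click, a non-rendering bot never sees them.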
Related Questions
-
Duplicate content question
Hey Mozzers! I received a duplicate content notice from my Cycle7 Communications campaign today. I understand the concept of duplicate content, but none of the suggested fixes quite seems to fit. I have four pages with HubSpot forms embedded in them. (Only two of these pages have shown up so far in my campaign.) Each page contains a title (Content Marketing Consultation, Copywriting Consultation, etc.), plus an embedded HubSpot form. The forms are all outwardly identical, but I use a separate form for each service that I offer. I'm not sure how to respond to this crawl issue:
- Using a 301 redirect doesn't seem right, because each page/form combo is independent and serves a separate purpose.
- Using a rel=canonical link doesn't seem right, for the same reason a 301 redirect doesn't.
- Using the Google Search Console URL Parameters tool is clearly contraindicated by Google's documentation (I don't have enough pages on my site).
Is a meta robots noindex the best way to deal with duplicate content in this case? Thanks in advance for your help. AK
Technical SEO | AndyKubrin
-
General Question: Linking Root Domains
Hi, we have several subpages with PA 1 and we're trying to figure out why. The link metrics show 0 linking root domains, but shouldn't there be at least our own domain, i.e. 1 linking root domain, or are we already getting that wrong? The subpages have the following link metrics (from MozBar): 0 external followed links; 8,416 subdomain and 8,421 root domain links; 0 linking root domains; 299 subdomain, 302 root domain. The pages seem to be crawled. We suspect technical reasons. What would be the impact of the linking root domains on the PA of 1? Thank you in advance.
Technical SEO | brainfruit
-
Sitemap issue
How can I create XML as well as HTML sitemaps for my website (both eCommerce and non-eCommerce)? Is there any script or tool that helps me make a perfect sitemap? Please suggest.
Technical SEO | Obbserv
-
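For the script route asked about above, an XML sitemap is simple enough to generate with the standard library. A minimal sketch - the URL list and output filename are placeholders, not from the question:

```python
# Sketch: build a sitemaps.org-format XML sitemap from a list of URLs.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")


xml_out = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products/",
])
# Write the result where crawlers expect it, e.g.:
# open("sitemap.xml", "w").write(xml_out)
```

An HTML sitemap is just an ordinary page of links, so the same URL list can be templated into a `<ul>` of anchors for human visitors.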
Some URLs in the sitemap not indexed
Our company site has hundreds of thousands of pages. Yet no matter how big or small the total page count, I have found that the "URLs Indexed" in GWMT has never matched "URLs in Sitemap". When we were small, and now that we have a LOT more pages, there is always a discrepancy of ~10% or so missing from the index. It's difficult to know which pages are not indexed, but I have found some that I can verify are in the Sitemap.xml file but not in the index at all. When I go to GWMT I can "Fetch and Render" missing pages fine - it's not as though they're blocked or inaccessible. Any ideas on why this is? Is this type of discrepancy typical?
Technical SEO | Mase
-
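One way to audit a gap like the one above is to extract every URL from the sitemap file and check each one individually (status code, meta robots, canonical target) instead of guessing. A small sketch, assuming a standard sitemaps.org-namespace sitemap; the common culprits for the missing ~10% are redirects, canonicals pointing elsewhere, and thin or near-duplicate pages:

```python
# Sketch: pull every <loc> URL out of a sitemap document so the
# "URLs in Sitemap" figure can be audited page by page.
import xml.etree.ElementTree as ET

# ElementTree prefixes tag names with the namespace in Clark notation.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]


# Feed the resulting list to HEAD requests or a crawl tool to find
# which specific URLs never make it into the index.
```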
Robots.txt and Multiple Sitemaps
Hello, I have a hopefully simple question, but I wanted to get a "second opinion" on what to do in this situation. I am working on a client's robots.txt and we have multiple sitemaps. Using Yoast I have my sitemap_index.xml, and I also have a sitemap-image.xml. I do submit them to Google and Bing by hand, but wanted to have them added to the robots.txt for insurance. So my question is: when having multiple sitemaps called out in a robots.txt file, does it matter if one is before the other? From my reading it looks like you can have multiple sitemaps called out, but I wasn't sure of the best practice when writing it up in the file. Example:

User-agent: *
Disallow:
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/

Sitemap: http://sitename.com/sitemap_index.xml
Sitemap: http://sitename.com/sitemap-image.xml

Thanks a ton for the feedback, I really appreciate it! :) J
Technical SEO | allstatetransmission
-
Should all pagination pages be included in sitemaps
How important is it for a sitemap to include all individual URLs for paginated content? Assuming the rel="next" and rel="prev" tags are set up, would it be OK to just have page 1 in the sitemap?
Technical SEO | Saijo.George
-
Meta tags question - imagetoolbar
We inherited some sites from another vendor & they have these tags in the head of all pages. Are they of any value at all? Thanks for the help! Wick Smith
Technical SEO | wcksmith
-
Sitemap with References to Second Domain
I have just discovered a client site that is serving content from a single database into two separate domains, and has created XML sitemaps which contain references to both domains in an attempt to avoid being tagged for duplicate content. I always thought that a sitemap was intended to show the files inside a single domain, and the idea of multiple domains in one sitemap had never occurred to me... The sites are both very large storefronts, and one of them (the larger of the two) has recently seen a 50% drop in search traffic and the loss of some 600 search terms from top 50 positions in Google. My first instinct is that the sitemaps should be altered to only show files within each domain, but I am worried about causing further loss of traffic. Is it possible that the inclusion of URLs for the second domain in the sitemap may in fact be signalling duplicate content to search engines? Does anyone have a definitive view of whether these sitemaps are good, bad or irrelevant?
Technical SEO | ShaMenz
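As a first diagnostic for a situation like the one above, it helps to list exactly which sitemap entries point at a host other than the one the sitemap is served from - as I understand the sitemaps.org cross-submission rules, such entries are only honoured when ownership of both hosts is verified (e.g. in Search Console), and are otherwise ignored. A hedged sketch with placeholder hostnames:

```python
# Sketch: flag sitemap <loc> entries whose host differs from the host
# the sitemap is served on, so cross-domain entries can be split out
# into per-domain sitemaps.
from urllib.parse import urlparse


def foreign_urls(sitemap_host, urls):
    """Return the URLs whose host does not match the sitemap's host."""
    return [
        u for u in urls
        if urlparse(u).netloc.lower() != sitemap_host.lower()
    ]


urls = [
    "https://store-a.example.com/widgets",
    "https://store-b.example.com/widgets",
]
# foreign_urls("store-a.example.com", urls) keeps only the store-b URL,
# i.e. the entry that belongs in the other domain's sitemap.
```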