Is Googlebot ignoring directives? Or is it me?
-
I saw an answer to a question in this forum a few days ago that said it was a bad idea to use robots.txt to tell Googlebot to go away.
That SEO said it was much better to use the meta tag to say noindex,nofollow.
So I removed the robots.txt directive and added the meta tag:
<meta robots='noindex,nofollow'>
Today, I see Google showing my send-to-a-friend page where I expected the real page to be.
Does it mean Google is stupid?
Does it mean Google ignores the robots meta tag?
Does it mean short pages have more value than long pages?
Does it mean if I convert my whole site to snippets, I'll get more traffic?
Does it mean garbage trumps content?
I have more questions, but this is more than enough.
-
Thank you Ryan.
They completely ignored the meta tags, completely messing up our SERPs. So I put the directive back in robots.txt. I won't trust Google again to do the right thing.
-
Hi Allan,
It is a best practice to use meta tags to indicate your indexing preference to search engines.
Normally the recommended implementation would be "noindex, follow" but without examining your site it is impossible to know for sure.
Google honors meta tags, but there are a number of issues that could be the source of your problem. For example, if you did not use valid syntax, the tag may not be honored. And if you are blocking the page in robots.txt, search engines cannot crawl the page, so they never see the tag at all.
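For reference, a valid robots meta tag requires both a `name` and a `content` attribute and must sit in the page's `<head>`; a tag missing either attribute will be ignored. A minimal sketch:

```html
<head>
  <!-- Both name and content attributes are required;
       directives are comma-separated inside content -->
  <meta name="robots" content="noindex, nofollow">
</head>
```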
As for the last three questions, the simple answer is that quality content is best.
If you can share the URL of the page involved, we can offer a specific response to the implementation of the meta tag.
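One pitfall worth checking when switching approaches: if robots.txt still disallows the page, Googlebot never fetches it and therefore never sees the noindex directive. A rule like the following (the path here is hypothetical, just for illustration) would need to be removed before the meta tag can take effect:

```
User-agent: *
Disallow: /send-to-friend/
```

Once the disallow rule is gone and the page is recrawled, the meta tag can be honored and the page dropped from the index.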