SSL and robots.txt question - confused by Google guidelines
-
I noticed "Don’t block your HTTPS site from crawling using robots.txt" here: http://googlewebmastercentral.blogspot.co.uk/2014/08/https-as-ranking-signal.html
Does this mean you can't use robots.txt anywhere on the site - even parts of a site you want to noindex, for example?
-
Hi Luke,
Just make sure that your robots.txt file, located at https://www.example.com/robots.txt, doesn't block search engine spiders from your HTTPS site. Of course there may be some folders or file types you want to block, but it certainly shouldn't look like the example below, which would block everything:
User-agent: *
Disallow: /
Hope that helps
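To illustrate the "block some things, but not everything" point above, a robots.txt along these lines allows crawling of the site as a whole while disallowing a couple of specific areas (the /admin/ folder and the PDF rule are hypothetical examples, not paths from the question):

```
User-agent: *
# Block a private section, but leave the rest of the site crawlable
Disallow: /admin/
# Block PDF files; * and $ are extensions supported by Googlebot
# and most major crawlers, not part of the original robots.txt standard
Disallow: /*.pdf$
```

Anything not matched by a Disallow line remains crawlable by default, so no explicit Allow rule is needed for the rest of the site.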
-
No, that's not what they mean - it means Google recommends you allow the secure version of your site (where applicable) to be crawled. You can still block certain pages or sections should you choose to do so.
With regard to noindexing, you could also place a noindex directive on the actual page as an alternative.
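For reference, a page-level noindex directive looks like this (a generic sketch, not markup from the site in question):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
  <title>Example page</title>
</head>
<body>...</body>
</html>
```

One caveat: the page must not also be blocked in robots.txt, or crawlers will never fetch it and so will never see the meta tag.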
Related Questions
-
How to be included in related questions (People also ask) in Google SERP?
Just wondered if anyone knows how to be included in the related questions ("People also ask") box in Google SERPs. Do you have to fulfil certain requirements, or is it a featured-snippet kind of thing?
Intermediate & Advanced SEO | Optimal_Strategies
-
Breadcrumbs not displaying on Google
Hello, We have set breadcrumbs on some of our pages (example: https://www.globecar.com/en/car-rental/locations/canada/qc/montreal/airport-yul) for testing purposes, and for some reason they are still not showing up on Google: http://screencast.com/t/BSHQqkP69r6F Yet when I test the page with Google's Structured Data Testing Tool all is good: http://screencast.com/t/Fzlz3zae Any ideas? Thanks, Karim
Intermediate & Advanced SEO | GlobeCar
-
Application & understanding of robots.txt
Hello Moz World! I have been reading up on robots.txt files, and I understand the basics. I am looking for a deeper understanding of when to deploy particular tags, and when a page should be disallowed because it will affect SEO. I have been working with a software company who has a News & Events page which I don't think should be indexed. It changes every week, and is only relevant to potential customers who want to book a demo or attend an event, not so much to search engines. My initial thinking was that I should use a noindex/follow tag on that page, so the page would not be indexed but all the links on it would still be crawled. I decided to look at some of our competitors' robots.txt files: Smartbear (https://smartbear.com/robots.txt), b2wsoftware (http://www.b2wsoftware.com/robots.txt) & labtech (http://www.labtechsoftware.com/robots.txt). I am still confused about what type of tags I should use, and how to gauge which set of tags is best for certain pages. I figured a static page is pretty much always good to index and follow, as long as it's public, and that I should always include a sitemap file. But what about a dynamic page? What about pages that are out of date? Will this help with soft 404s? This is a long one, but I appreciate all of the expert insight. Thanks ahead of time for all of the awesome responses. Best Regards, Will H.
Intermediate & Advanced SEO | MarketingChimp10
-
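Beyond the meta tag, the noindex/follow approach described in the question can also be delivered as an HTTP response header, which works for non-HTML files as well. A sketch, assuming an Apache server with mod_headers enabled (the directory layout is hypothetical):

```apache
# Hypothetical fragment: placed in an .htaccess file inside the
# News & Events directory, this sends noindex,follow for every
# file served from that section
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, follow"
</IfModule>
```

As with the meta tag, the section must stay crawlable (not disallowed in robots.txt) for the header to be seen.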
Fetch as Google - Redirected
Hi I have swapped from HTTP to HTTPS and put a redirect in place so HTTP redirects to HTTPS. I also redirect www.xyz.co.uk/index.html to www.xyz.co.uk. When I fetch as Google it shows up as "redirected". Does this mean that I have too many 301s looping? Do I need the redirect from index.html to the root domain if I have a rel canonical in place for index.html? My .htaccess (Linux) contains:
RewriteCond %{HTTP_HOST} ^xyz.co.uk
RewriteRule (.*) https://www.xyz.co.uk/$1 [R=301,L]
RewriteRule ^$ index.html [R=301,L]
Intermediate & Advanced SEO | Cocoonfxmedia
-
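For comparison, a sketch of how rules like these could be arranged so every non-canonical URL redirects to the canonical one in a single hop rather than a chain (hostnames are the placeholders from the question; this assumes mod_rewrite and is untested against the actual site):

```apache
RewriteEngine On

# Send index.html straight to the canonical root in one 301
RewriteRule ^index\.html$ https://www.xyz.co.uk/ [R=301,L]

# Send anything that isn't already https://www.xyz.co.uk
# to the canonical scheme and host in one 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.xyz\.co\.uk$ [NC]
RewriteRule (.*) https://www.xyz.co.uk/$1 [R=301,L]
```

Checking the request with a tool that shows each redirect hop (e.g. curl -IL) confirms whether any URL still passes through more than one 301.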
Blocking out specific URLs with robots.txt
I've been trying to block out a few URLs using robots.txt, but I can't seem to get the specific one I'm trying to block. Here is an example: I'm trying to block something.com/cats but not block something.com/cats-and-dogs. It seems if I set up my robots.txt like so:
Disallow: /cats
it blocks both URLs. When I crawl the site with Screaming Frog, that Disallow causes both URLs to be blocked. How can I set up my robots.txt to specifically block /cats? I thought it was by doing it the way I was, but that doesn't seem to solve it. Any help is much appreciated, thanks in advance.
Intermediate & Advanced SEO | Whebb
-
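A sketch of the usual fix for the problem described above: robots.txt Disallow rules are prefix matches, so /cats also matches /cats-and-dogs. Google and most major crawlers support the $ end-of-URL anchor, which matches /cats exactly:

```
User-agent: *
# Matches /cats and nothing else - /cats-and-dogs stays crawlable
Disallow: /cats$
```

If /cats has sub-pages such as /cats/page-2, a separate Disallow: /cats/ line would be needed for those, and note that $ is a widely supported extension rather than part of the original robots.txt standard, so some minor crawlers may ignore it.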
Duplicate Content Question
My understanding of duplicate content is that if two pages are identical, Google selects one for its results... I have a client that is literally sharing content in real time with a partner: the page content is identical for both sites, and if you update one page, the other is updated automatically. Obviously this is a clear-cut case for canonical link tags, but I'm curious about something: both sites seem to show up in search results, but for different keywords. I would think one domain would simply win out over the other, but Google seems to show both sites in results. Any idea why? Also, could this duplicate content issue be hurting visibility for both sites? In other words, can I expect a boost in rankings with the canonical tags in place? Or will rankings remain the same?
Intermediate & Advanced SEO | AmyLB
-
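For reference, the canonical link tag mentioned in the question goes in the head of both copies of the page, pointing at whichever URL should be treated as the original (example.com is a placeholder, not the client's domain):

```html
<head>
  <!-- Both the client's page and the partner's copy point
       at the one URL that should rank -->
  <link rel="canonical" href="https://www.example.com/shared-article">
</head>
```

Google treats the canonical as a strong hint rather than a directive, which may explain why both domains can still appear for different queries even after the tag is in place.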
Google Places
Is there a way to get to the top of Google Places? Can it be manipulated?
Intermediate & Advanced SEO | dynamic08
-
Is Google mad at me for redirecting...?
Hi, I have an e-commerce website that sells unique items (one of a kind). We have hundreds of items and they sell rapidly. Up till now I have kept the sold items under our "sold items" section, but it has started to come back at me: we now have more sold items than unsold ones, and we are having duplication problems (the items are quite similar apart from sizes etc.). What should we do? Should we redirect 100 pages each week? Will Google be upset with that (for driving it crazy)? Thanks
Intermediate & Advanced SEO | BeytzNet