Login webpage blocked by robots
-
Hi, the SEOmoz crawl diagnostics show that this page:
www.tarifakitesurfcamp.com/wp-login.php is blocked (noindex, nofollow).
Is there any problem with that?
-
thanks!
-
Unless the login page contains information that's relevant to your users (i.e. it's for more than your private use), it's probably a good idea not to index it!
-
Nope, that's perfectly fine, since that's your login page for WordPress.
If you're linking to the page from anywhere on your site (which you really shouldn't be), you could update the meta robots tag to (noindex, follow), but since it looks like the page has no inbound links, it shouldn't be necessary.
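For reference, the meta robots tag in question lives in the page's head; a minimal sketch of the "noindex, follow" variant mentioned above (the markup is illustrative, not copied from the site):

```html
<head>
  <!-- Keep the login page out of the index, but let crawlers
       follow any links on it (e.g. back to the homepage): -->
  <meta name="robots" content="noindex, follow">
</head>
```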
Related Questions
-
Help recover lost traffic (70%) from robots.txt error.
Our site is a company information site with 15 million indexed pages (mostly company profiles). Recently we had an issue with a server that we replaced, and in the process mistakenly copied the robots.txt block from the staging server to a live server. By the time we realized the error, we had lost 2/3 of our indexed pages and a comparable amount of traffic. Apparently this error took place on 4/7/19 and was corrected two weeks later. We submitted new sitemaps to Google and asked them to validate the fix approximately a week ago. Given the close to 10 million pages that need to be validated, so far we have not seen any meaningful change. Will we ever get this traffic back? How long will it take? Any assistance will be greatly appreciated.

On another note, these indexed pages were never migrated to SSL for fear of losing traffic. If we have already lost the traffic and/or if it is going to take a long time to recover, should we migrate these pages to SSL? Thanks,
On-Page Optimization | akin671
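For anyone hitting the same mistake: a staging robots.txt typically blocks everything, while the live file should allow crawling. A hedged sketch of the two files (the paths and sitemap URL are placeholders, not from the original post):

```
# Staging robots.txt (what was mistakenly copied to the live server)
# - blocks all crawling:
User-agent: *
Disallow: /

# Live robots.txt (what it should be) - allow everything and point
# crawlers at the sitemap:
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```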
Two Robots.txt files
Hi there,

Can somebody please help me? One of my client's sites has two robots.txt files (please see below). One file blocks a few folders, and the other one blocks all search engines completely. Our tech team is telling me that, due to some technical reasons, they are using the second one, which is placed inside the server, and search engines are unable to see this file.

www.example.co.uk/robots.txt - blocks a few folders
www.example.co.uk/Robots.txt - blocks all search engines

I hope someone can give me the help I need on this one. Thanks in advance! Cheers,
On-Page Optimization | TrulyTravelSatla
Can I robots.txt an entire site to get rid of Duplicate content?
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content to serve two separate user groups (for the same product: B2B and B2C). Zendesk does not allow me the option to change canonicals (nor meta tags). If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option? Is there a better option?

I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by changing the established page canonicals on my site to point to the new subdomain (the only option offered through Zendesk)? Thank you.
On-Page Optimization | RoxBrock
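A sketch of the robots.txt option asked about above: block crawling of the duplicate Zendesk subdomain entirely (the subdomain name is hypothetical, and note that robots.txt prevents crawling, not necessarily indexing of URLs Google already knows about):

```
# robots.txt served at the *duplicate* help-center subdomain only,
# e.g. b2c-help.mysite.com/robots.txt (hypothetical name) - the
# primary subdomain keeps a normal, permissive robots.txt:
User-agent: *
Disallow: /
```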
Need suggestion: Should the user profile link be disallowed in robots.txt
I maintain a myBB-based forum here. The user profile links look something like this: http://www.learnqtp.com/forums/User-Ankur

Now in my GWT, I can see many 404 errors for user profile links. This is primarily because we have tight control over spam and auto-profiles generated by bots. Either our moderators or our spam-control software deletes such spammy member profiles on a periodic basis, but by then Google has indexed those profiles. I am wondering, would it be a good idea to disallow user profile links using robots.txt? Something like: Disallow: /forums/User-*
On-Page Optimization | AnkurJ
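As an aside on the proposed wildcard rule: Google does support `*` in robots.txt paths, and the matching behavior can be illustrated with Python's `fnmatch` (this is only a stand-in for prefix-wildcard matching, not Google's actual robots.txt parser; the helper name is invented):

```python
from fnmatch import fnmatchcase

# Proposed robots.txt rule from the question: Disallow: /forums/User-*
RULE = "/forums/User-*"

def is_disallowed(path):
    """Return True if the path would match the Disallow pattern."""
    # fnmatchcase's "*" behaves like the robots.txt wildcard for this
    # simple trailing-wildcard pattern (case-sensitive, as URLs are).
    return fnmatchcase(path, RULE)

print(is_disallowed("/forums/User-Ankur"))       # True - profile page blocked
print(is_disallowed("/forums/Thread-qtp-tips"))  # False - threads still crawlable
```

In practice, any rule like this is worth double-checking in Google Search Console's robots.txt tester before deploying.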
I have more pages in my site map being blocked by the robot file than I have being allowed to be crawled. Is Google going to hate me for this?
I'm using some rules to block all pages which start with "copy-of" on my website, because people have a bad habit of duplicating new product listings to create our refurbished, surplus, etc. listings for those products. To avoid Google seeing these as duplicate pages, I've blocked them in the robots file, but of course they are still automatically generated in our sitemap. How bad is this?
On-Page Optimization | absoauto
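One way to resolve the contradiction is to filter the blocked URLs out before the sitemap is written, so robots.txt and the sitemap agree. A minimal sketch, assuming the sitemap is generated from a plain list of URLs (the URLs and helper name below are made up for illustration):

```python
def keep_in_sitemap(url):
    """Keep a URL only if its path does not start with 'copy-of',
    mirroring the robots.txt rule: Disallow: /copy-of"""
    path = url.split("://", 1)[-1].split("/", 1)[-1]
    return not path.startswith("copy-of")

urls = [
    "https://example.com/widget-pro",
    "https://example.com/copy-of-widget-pro",  # auto-duplicated listing
    "https://example.com/widget-mini",
]

# Only the original product pages make it into the sitemap.
sitemap_urls = [u for u in urls if keep_in_sitemap(u)]
print(sitemap_urls)
```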
One Webpage per Topic or splitting up for better reading...?
What is better from an SEO point of view? I am building a website right now where the principal topic is renewable energies. There will be a menu listing all kinds of energy types: biogas, CSP, biomass, etc.

And now my question: each topic has about 800-1,000 words of unique content with sub-topics. I think it's certainly good to have one separate page for each energy type. But I don't think it's a good idea to also split the sub-topics up into further sub-pages like:

www.energy.com/renewable-energies-biomass.html
www.energy.com/renewable-energies-biomass-eficiency.html
www.energy.com/renewable-energies-biomass-market.html
www.energy.com/renewable-energies-biomass-industries.html

as 1,000 words on one page may look like higher-quality content than 3-4 pages with just 200 words each, all talking about biomass but from several points of view. So I think it's better to put everything about biomass on one single page and use a menu just to jump to the sub-topics via anchor tags. Right? 🙂

Thanks Kate and Charles! Meanwhile I found out the right term for my question: "pagination". I read about using the rel="next" and rel="prev" attributes when paginating an article over different pages.

MY DOUBT: Sometimes you see a single page paginated using JavaScript that hides text (although all of it is in the page source) for better reading. Does Google like that, or might it think it could be hidden text with a spamming purpose? So I think using old-school named anchors to divide text into topics (for text of about 1,000 words) is better than using JavaScript that reveals text via pagination or expand/collapse.
On-Page Optimization | inlinear
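For the in-page anchor approach described above, a minimal HTML sketch (the section ids and headings are invented for illustration):

```html
<!-- Jump menu at the top of the single biomass page -->
<nav>
  <a href="#efficiency">Efficiency</a>
  <a href="#market">Market</a>
  <a href="#industries">Industries</a>
</nav>

<!-- Each sub-topic gets an id the menu links can target -->
<h2 id="efficiency">Biomass efficiency</h2>
<p>…</p>
<h2 id="market">Biomass market</h2>
<p>…</p>
<h2 id="industries">Biomass industries</h2>
<p>…</p>
```

All of the text stays visible in one document, so there is no hidden-text concern, and Google can surface the fragment links as jump-to results.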
Using meta robots 'noindex'
Alright, so I would consider myself a beginner at SEO. I've been doing merchandising and marketing for ecommerce sites for about a year and a half now, and am just now starting to apply some intermediate SEO techniques to the sites I work on, so bear with me.

We are currently redoing the homepage of our site and I am evaluating what links to have on it. I don't want to lose precious link juice to pages that don't need it, but there are certain pages that we need to have on the homepage that people just won't search for. My question is: would it be a good move to add the meta robots 'noindex' tag to these pages? Is my understanding correct that if the only link on the page is back to the homepage, it will pass back the link juice?

Also, how many homepage links are too many? We have a fairly large ecommerce site with a lot of categories we'd like to feature, but we don't want to overdo the homepage. I appreciate any help!
On-Page Optimization | ClaytonKendall
What is the best way to make use of internal anchor text links without appearing to be a 'spammy' webpage?
I've recently been spending some time going through all the content on our website, henstuff.com, adding internal anchor-text links to product copy, with each link pointing back to the product's generic category. I've been focusing on the search term 'hen party accessories', but have also been using 'hen do accessories' and 'hen night accessories'.

I know that internal linking has value when it comes to SEO and rankings, but I was keen to find out roughly at what point usage of a certain search term in anchor links is seen as spam by the engines. Is there a certain formula to follow when it comes to internal anchor-text links? You can see some examples at:

http://www.henstuff.com/hen-night-accessories/hen-party-accessories/willy-bubbles
http://www.henstuff.com/hen-night-accessories/hen-party-devil-horns/hen-night-pink-devil-horns

Many thanks, Oli
On-Page Optimization | RobertHill
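To illustrate the kind of variation discussed above, internal links to the same category page can rotate their anchor text rather than repeating one exact-match phrase (the copy and URL path below are invented for illustration):

```html
<!-- Same destination, varied anchors: exact-match, variant, and natural -->
<p>Browse our full range of
   <a href="/hen-night-accessories/">hen party accessories</a>.</p>
<p>New <a href="/hen-night-accessories/">hen night accessories</a>
   are added every week.</p>
<p>See everything in the
   <a href="/hen-night-accessories/">accessories section</a>.</p>
```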