Login webpage blocked by robots
-
Hi, the SEOmoz crawl diagnostics show that this page:
www.tarifakitesurfcamp.com/wp-login.php is blocked (noindex, nofollow).
Is there any problem with that?
-
thanks!
-
Unless you have information relevant to your users on the login page (i.e. it's for your private use only), it's probably a good idea not to index it!
-
Nope, that's perfectly fine, since that's your login page for WordPress.
If you're linking to the page from anywhere on your site (which you really shouldn't be), you could update the meta robots tag to (noindex, FOLLOW), but since it looks like the page has no links pointing to it, that shouldn't be necessary.
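For reference, this blocking is done with a meta robots tag in the page's <head>. A sketch of the two variants discussed (the exact markup WordPress emits varies by version, so treat this as illustrative rather than WordPress's literal output):

```html
<!-- What the crawl report indicates is on wp-login.php:
     don't index the page, and don't follow its links. -->
<meta name="robots" content="noindex, nofollow">

<!-- The variant for pages that are linked internally: still keep the
     page out of the index, but let crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```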
Related Questions
-
Meta Robots information
Hi, I have a question about the meta robots information. According to the Moz bar, our page uses the meta robots noodp and noydir. Our competitor uses INDEX,FOLLOW. I read that noodp and noydir are dated and not used anymore. Is it wise to use INDEX,FOLLOW instead for better SEO? Thanks in advance!
On-Page Optimization | | AdoBike
-
Is it urgent to have fewer than 100 internal links on a webpage?
Hi, our website is set up so that our top menu is on every page, which means every page has around the same number of internal links (225-ish). I have many product pages that score 99 on Page Optimization, with the only recommendation being a note saying not to have too many internal links. My understanding is that internal links are defined as any URL on a page pointing to another part of the same root domain/site. So, for example, my page https://www.twowayradiosfor.com/Motorola-CP185-p/cp185-lkp.htm has 225 internal links in its source code. Is this an issue that needs to be fixed for our pages to rank, or is it only a recommendation that doesn't really impact SEO that much? If it is the only issue listed for a particular page, is there another reason that page might not be ranking even though it has a 99 score, or is it because of the 225 internal links? If I do need to get below 100 internal links, where do I go to fix this? Do I erase the links, or set up a nofollow tag? I appreciate any help or guidance. Thank you! Austin
On-Page Optimization | | AllChargedUp2 -
I have more pages in my sitemap being blocked by the robots file than I have being allowed to be crawled. Is Google going to hate me for this?
I'm using some rules to block all pages that start with "copy-of" on my website, because people have a bad habit of duplicating new product listings to create our refurbished, surplus, etc. listings for those products. To avoid Google seeing these as duplicate pages, I've blocked them in the robots file, but of course they are still automatically generated in our sitemap. How bad is this?
On-Page Optimization | | absoauto0 -
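For the "copy-of" question above, a minimal robots.txt sketch, assuming the duplicated pages sit at the site root with a copy-of prefix (the path is an assumption; adjust it to where the duplicates actually live):

```text
User-agent: *
# Block any URL whose path starts with /copy-of
Disallow: /copy-of
```

Search Console will typically flag sitemap URLs that robots.txt blocks, so the cleaner long-term fix is to also exclude those URLs from the sitemap itself.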
70 Domain Names Point to 70 Nearly Identical Inner Webpages
I have a new SEO client; his website has never been optimized. There are 70 domain names involved with this one website. Each domain name points to an exact replica of the main page, other than the fact that a small content box has different info in it, and sometimes the header graphic is different. So, 70 webpages that are 5% different from each other and 5% different from the main page. How badly is this issue affecting this website's ability to rank well, and what is the best way to solve this?
On-Page Optimization | | netsites0 -
Large block of several thousand words on homepage of Ecommerce site - opinions?
I work for a local SEO company that continually adds content to the home pages of clients' websites. While I do agree that it is a good idea to add content to the home page and to link to inner pages, the home pages of several clients exceed 6,000 words. Every month, an article based on a brand or category (which already has its own page optimised with an article) is added to the home page with links to inner pages. I have stressed that it is not a good idea to have so much content on the home page, and even more so that it shouldn't all target the same keywords. I have pointed out that many of the brand and category pages do not rank well for their keywords, whereas in most cases it is the home page that ranks instead; I have suggested this is because the home page is too well optimised for those keywords. What are your opinions on this? Are you for or against continually adding content (which already has its own designated page) to the home page of an ecommerce website? I should also add that this content is within a drop-down div box at the footer on some sites and just above the footer on others. The category and brand pages also have drop-down divs with an article just below the header.
On-Page Optimization | | Kinsel0 -
Right way to block Google robots from PPC landing pages
What is the right way to completely block SEO robots from my AdWords landing pages? Robots.txt doesn't work very well for that, as far as I know. Adding noindex, nofollow meta tags, on the other hand, will block the AdWords robot as well, right? Thank you very much, Serge
On-Page Optimization | | Kotkov0 -
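One hedged option for the PPC landing-page question above: Google documents that AdsBot ignores a generic User-agent: * exclusion unless it is named explicitly, so robots.txt can shut out organic crawlers while leaving ad review untouched. The /landing/ path here is a placeholder:

```text
# Keep general crawlers (including organic Googlebot) out of the PPC pages.
User-agent: *
Disallow: /landing/

# AdsBot-Google only obeys rules addressed to it by name,
# so this explicit Allow keeps landing-page checks working.
User-agent: AdsBot-Google
Allow: /
```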
In my report of my website, it was indicated that I had 19 links/locations blocked by meta-robots. What does this mean, and how do I fix it? My website is a WordPress website.
On-Page Optimization | | cyaindc0 -
Robots.txt: excluding URL
Hi, spiders crawl some dynamic URLs on my website (example: http://www.keihome.it/elettrodomestici/cappe/cappa-vision-con-tv-falmec/714/ + http://www.keihome.it/elettrodomestici/cappe/cappa-vision-con-tv-falmec/714/open=true) as different pages, resulting in duplicate content, of course. What is the syntax to disallow these kinds of URLs in robots.txt? Thanks so much
On-Page Optimization | | anakyn0
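On the syntax asked about in the last question: Google (and most major crawlers) support the * wildcard and the $ end-of-URL anchor in robots.txt, so the open=true variants can be excluded along these lines (a sketch; verify it against your actual URL patterns before relying on it):

```text
User-agent: *
# Block any URL containing "open=true" anywhere in the path
Disallow: /*open=true
```

Appending $ (Disallow: /*open=true$) restricts the rule to URLs that end with open=true, which matches the example given.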