DMOZ help
-
So yesterday I got a DMOZ editor account. I would like to know if Google indexes the editor profile pages on DMOZ:
http://www.dmoz.org/public/profile?editor=
Here are some examples:
http://www.dmoz.org/public/profile?editor=thehelper
http://www.dmoz.org/public/profile?editor=raph3988
http://www.dmoz.org/public/profile?editor=skasselea
I would like to know if it is worthwhile to build up this page so it will pass link juice. And can anyone tell me how frequently Google crawls for new editors (if that's even possible)?
-
Hello,
I wouldn't bet on it, but there's no harm in trying.
-
You can confirm this yourself.
First, do a Google search for site:www.dmoz.org/public/profile
You see how the meta descriptions aren't shown in the results? Instead, Google puts a default message, with a link to this page: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449 (check that out). Note the paragraph:
"While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results."
So whilst they may appear in Google's index (and indeed the OSE one) because of the links pointing to them, the content isn't crawled at all (by any spiders that obey robots.txt).
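If you want to verify this yourself programmatically, here's a minimal sketch using Python's standard-library urllib.robotparser (assuming dmoz.org is reachable and its robots.txt hasn't changed; the profile URL and user agent are just the examples from this thread):

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse DMOZ's robots.txt.
rp = RobotFileParser("http://www.dmoz.org/robots.txt")
rp.read()

# Ask whether a robots.txt-obeying crawler may fetch an editor profile page.
profile_url = "http://www.dmoz.org/public/profile?editor=thehelper"
print(rp.can_fetch("Googlebot", profile_url))  # expect False: /public is disallowed
```

A False result just means compliant spiders won't read the page content; as the Google quote above says, the URL itself can still end up in the index via inbound links.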
-
Oh yes, he's correct. Good call, Neil. I had no idea that the robots.txt file would be publicly accessible; I've actually never seen a site with its robots.txt visible... I guess it's the "open source"...
-
Can anyone confirm this?
-
Take a look at their robots.txt: http://www.dmoz.org/robots.txt
They disallow the /public and /editors subfolders. The editor pages, whilst indexed by Google, aren't crawled; so whilst the URLs of the pages themselves are indexed (because of links to those pages), the contents of the pages aren't. That obviously includes any links on those pages, too.
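Based on the disallow rules just described, the relevant lines of that file look roughly like this (an illustrative excerpt, not the complete file):

```
User-agent: *
Disallow: /editors
Disallow: /public
```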
For this reason, I don't agree with Reload Media. For me, there's no point expending any effort promoting the page for link equity benefit.
The fact they show good authority on OSE is something of an anomaly. They can accrue authority (and indeed Google PR) from their inbound links; however, they're a bit of a dead end, because no actual content is indexed.
-
Hi Raphael,
Well done on getting an editor account. Remember, with great power comes great responsibility!
Yes, they do get indexed. The way to check this is to Google the URL in quotes, i.e. "http://www.opensiteexplorer.org/links?site=www.dmoz.org%2Fpublic%2Fprofile%3Feditor%3Dthehelper"
Some of those editor pages have great authority: http://www.opensiteexplorer.org/links?site=www.dmoz.org%2Fpublic%2Fprofile%3Feditor%3Dthehelper
If it's related to your niche, then it would be worth pursuing.
Hope that helps
Iain - Reload Media
-
Using http://pro.seomoz.org/tools/on-page-keyword-optimization, you can check individual pages. In the keyword field I put "thehelper" and in the URL field I put http://www.dmoz.org/public/profile?editor=thehelper... so it seems like it does get indexed. :)