How does Google differentiate websites like directories?
-
Hi,
I want to ask how Google differentiates websites like directories or company listing websites. How does it understand that it is normal for a directory site to have many links? Are there any guides or links about what to do and what to avoid, and how to approach SEO for a directory website?
-
I'm afraid I've seen issues with even decent quality directories. I have a client who originally published a print directory and then moved it to the web - they have relationships with all of the companies whose "products" they list (it's not a product directory, but that doesn't matter much for this discussion), they use unique descriptions, and they really focus on quality. Still, they've seen gradual declines, as the OEMs (essentially) rank higher and higher.
I think Google is just starting to view the directory model in general with more and more suspicion. They'd rather land someone directly on a result or answer than route them through what's essentially another search engine. That's not to say you're automatically penalized, but it's a tougher model to succeed at than it was even 5 years ago.
I think, ultimately, you've really got to show some kind of unique value. If you're basically just listing other people's information, your chances of making much of a splash in the rankings aren't great these days.
-
Right now, I can tell you that I have a message proving that directories will indeed get you penalized. I had a PR4, 51-authority link on a directory site that was very popular. Google doesn't generally give you any hints about links, but they did, and I have proof. They told me to take down specific directory links that we had built to very relevant sections. Not only that, but they even told me that the main URL of my site, which I used as the keywords, was still spam.
So I feel Google is hand-picking the directory sites it trusts, like DMOZ, etc. I also don't feel a directory link is that high quality anyway. What can you really say about your site in 30 or so words?
I am not saying to stay away from them all; I am just saying be very careful. I thought they were OK to use if you picked good ones, but that's not the case.
Have a great night.
MB
Related Questions
-
Why does my site rank so badly?
Hello everyone, I think you read questions like this very often, but I hope someone has some good ideas for me. https://goo.gl/3iOmcqy With the keyword "sophos sg 210" we rank very badly, but I don't know why 😞 We have trust elements, a very good average time on page, unique content... but I don't know what we can do better 😞 Thank you so much
On-Page Optimization | enbitcon
-
Improving the search function on my site...
Hi all, The search function on my site is pretty bad... it basically lists every single product for any search query. Has anyone got experience integrating a third-party tool such as Google Custom Search or Pardot, and if so, which would you recommend? Alternatively, any tips on improving or creating rules for site search would be appreciated. https://www.boardwarehouse.co.uk/ Thanks, Alick
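For what it's worth, the standard Google Programmable Search Engine (the current name for Google Custom Search) embed is just a script tag plus a placeholder element. A minimal sketch, where the cx value is a placeholder you would swap for your own engine ID:

<!-- Loads the Programmable Search Engine script; cx identifies your engine (placeholder value here) -->
<script async src="https://cse.google.com/cse.js?cx=YOUR_ENGINE_ID"></script>
<!-- The search box and results render inside this element -->
<div class="gcse-search"></div>

The engine can be restricted to your own domain in its settings, which avoids the "every product matches" behaviour of a naive built-in search.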
On-Page Optimization | Alick300
-
I'm looking to put a quite lengthy FAQs tab on product pages on an ecommerce site. Am I likely to have duplicate content issues?
On an ecommerce site we have unique content on the product pages (i.e. descriptions), as well as the usual delivery and returns tabs for customer convenience. From this we haven't had any duplicate content issues or warnings, which seems to be the case industry-wide. However, we're looking to add a more lengthy FAQs tab which is still highly relevant to the customer but contains a lot more text than the other tabs. The product descriptions are also relatively small. Do you think this will cause potential duplicate content issues or should it be treated the same as a delivery tab, for instance?
On-Page Optimization | creativemay
-
How to use a canonical tag from a mobile site to the main site
I am pretty sure that the mobile version needs to point to the same canonical link as the main site, from what I understand. I am trying to find good documentation that supports this - even better if it's from Google or Matt Cutts. I have a main domain like http://www.mydomain.com and the mobile version of it is http://www.mydomain.com/m/. Should my canonical be <link rel="canonical" href="http://www.mydomain.com/" /> for both of these pages?
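For reference, a minimal sketch of the separate-mobile-URL annotations Google has documented, applied to the example URLs above - the canonical goes on the mobile page only, pointing back to the desktop URL, while the desktop page declares the mobile alternate:

<!-- On the desktop page, http://www.mydomain.com/ -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://www.mydomain.com/m/" />

<!-- On the mobile page, http://www.mydomain.com/m/ -->
<link rel="canonical" href="http://www.mydomain.com/" />

In other words, putting the same rel="canonical" on both pages isn't quite right; the desktop page's canonical, if it has one, should point to itself.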
On-Page Optimization | cbielich
-
On-Site Problem Caused Rankings To Drop?
I'm getting to the bottom of why my site dropped rankings on the 9th of March. I noticed that in Google there is a cached version of my site from the 9th of March at 8.37am, which is when my rankings disappeared. Presumably this is when Google last crawled my site? I guess this means that Google found something on the home page or on the site that it didn't like? I wonder if anyone can take a look and let me know if there's anything obvious. Could it be a duplicate content penalty, as I have lots of categories pulling content from the same posts?
On-Page Optimization | SamCUK
-
Google Index Report
Hi, I have just checked my Google Webmaster Tools account and viewed the index status of my website, and it produced the attached graph, which shows quite a big spike in indexing during July and August 2012. Does this look normal or does it reveal anything peculiar? We had a new website launched in June 2012 and I resubmitted the site's URLs to Google as part of the relaunch, so I am unsure whether this may account for the spike. Any advice appreciated. Thanks indexing.png
On-Page Optimization | UnderMe
-
Large Site - Advice on Subdomaining
I have a large news site - over 1 million pages (I have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining. http://bit.ly/dczF5y

There are two types of content - news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP or we pull from their server. These are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs at the time the release comes in and handles most of them. The other runs every night after midnight and finds a few, which are then handled manually. This helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.

The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us. Some are from contributors who also contribute to other news sites.

Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like and I haven't yet worked it out. A lot of people have looked and given me their ideas, and I've tried them - zero effect.

Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even if it is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up on the front page, often in position #2 or #3.

According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem we haven't fixed.

You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites that they consistently rank above us, that have all the same content as us and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different. So this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next.

Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screwups because we had multiple editors who didn't see that a story was already published, and also, at one time, a system code race condition - entirely my fault; I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.

I haven't sent in a reconsideration request for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't.

So, having exhausted all of the things I can think of, I'm down to my last few ideas:

1. Split all of the PRs off into subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline, a snippet and some related info, and link back to the original page on the PR provider's website. (I really don't want to do this.)
3. Give up on the PRs and delete them all, losing another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us (or find them all and rewrite them as stories - tens of thousands of them), and also throwing all our alliances under the bus. (I really don't want to do this either.)

There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped.

My thought is that splitting the PRs off into subdomains will have a number of effects:

1. Take most of the syndicated content onto subdomains, so it's not on the main domain.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.

Here is what I plan to do:

1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.

Here are my questions - and I hope you may see something I haven't considered:

1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them in only one place, but there are two types of these index pages: a) all of the releases for a particular PR company - these certainly could be on the subdomain and not on the main domain; b) various category index pages - agriculture, supermarkets, mining, etc. These would have to stay on the main domain because they are a mixture of different PR providers.
5. Is this a bad idea?

I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
On-Page Optimization | loopyal
-
Major update to site architecture (outline) - is Google going to drop the site?
I'm working with a lawyer client who has a table-based, outdated site. Her nav links consist of a jumble of topics and static pages in one long sidebar list on the home page. I'm moving her site to Wordpress and I've recommended that she organize the site based on categories that roughly match the topics/keywords she wants to rank highest for in Google. The site will be much better organized and coded and the URLs for the new launch will be much stronger for SEO by being targeted and coded properly. So the site should rank better after, right? Right??? I know that when Google crawls the new architecture, it's not going to find the expected long sidebar list of internal nav links. It'll find better, more keyword targeted internal nav links. But will that keep the site from getting dropped off page 1? I'm speaking w/ the client tomorrow and if she's going to drop or get bounced around, I feel like I should prepare her and let her know roughly what might happen. I'm thinking based on my current understanding that I should tell her to expect to be bounced around for a few weeks, but in the end she should rank higher than before. What would you do/say?
On-Page Optimization | bvrob