Do we need to Disallow profiles from discussions or forums?
-
Hi,
We run a forum where users create threads, like any other community (e.g., Moz), and thousands of pages are getting created. New threads and comments are fine, as they have relevant content. We are planning to "Disallow" all profile pages, since they do not help with content relevancy and thousands of such pages may dilute the link juice. Is this the right way to proceed?
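A minimal robots.txt sketch of the approach described above, assuming the forum's profile pages live under a /profile/ path (hypothetical; substitute your forum's actual URL pattern). Note that Disallow blocks crawling, not indexing, so already-indexed profile URLs can still appear in results:

```text
# Block crawling of profile pages for all crawlers.
# /profile/ is a placeholder path; match your forum's real URL structure.
User-agent: *
Disallow: /profile/
```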
Thanks
-
We have never had to nofollow internal links within any of our websites. I am sure there are some circumstances where you could, but since Google's algorithm no longer "penalizes" such links, I don't see a need to; they are simply ignored. Just avoid spammy tactics like having 100 links under the footer, a bunch of affiliate links, etc.
-
Hi Lure,
Thanks for the answer. We are going to de-index these profile pages. Just wondering about "nofollow", as nofollowing internal links is not always correct. What's your call on this?
-
I would if I were in your shoes. Having all profile and/or comment links be followed can invite a lot of spam and could potentially undermine the longevity and credibility of your forum.
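A sketch of what nofollowing those links looks like in a forum template (the URL and anchor text here are hypothetical):

```html
<!-- Profile link in a post byline. rel="nofollow" asks search engines
     not to pass link equity through this internal link. -->
<a href="/profile/example-user" rel="nofollow">example-user</a>
```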
-
I would normally say yes, but it really depends on how good your forum is. Sometimes people like to do reputation marketing: if the site is authoritative, you want to show up for queries on your name as a contributor to an important website. But if your profile pages are not relevant, nice, or of any interest to your users, I don't see why you'd want to let Google index and serve them.
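If the goal is de-indexing rather than just blocking crawling, the usual mechanism is a meta robots tag on the profile page template (sketch below). Keep in mind that Google must be able to crawl the page to see this tag, so the same URLs should not also be Disallowed in robots.txt:

```html
<!-- In the <head> of each profile page: ask search engines
     not to index the page, while still following its links. -->
<meta name="robots" content="noindex, follow">
```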
Related Questions
-
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz Community because we're completely stumped. I actually work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced:

We took on a client looking for SEO a few months ago. They had an OK site, with a small but high-quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword targeted (obviously not in a spammy way), and on-site content, such as services pages, which was quite thin, was enriched with more user-focused content. Interesting, shareable content was starting to be created and some basic outreach work had started.

Things were starting to pick up. The site started showing and growing for some very relevant keywords in Google, a good range and at different levels (mostly sitting around page 3-4) depending on competition. Local keywords, particularly, were doing well, with a good number sitting on pages 1-2. The keywords were starting to deliver a gentle stream of relevant traffic and user behaviour on-site looked good.

Then, as of the 28th September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They literally lost all of their keywords. Our client even dropped hundreds of places for their own brand name. They also lost all rankings for super-low-competition, non-business terms they were ranking for. So, there's the problem. The keywords have not shown any sign of recovery at all yet and we're, understandably, panicking. The worst thing is that we can't identify what has caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause it.

There are no messages or warnings in GWT. The link profile is small but high quality. When we started, the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and seems far too severe. The site is technically sound. There are no duplicate content issues or plagiarised content. The site is being indexed fine. Moz gives the site a spam score of 1 (out of 11, I think that's right). The site is on an OK server, which hasn't been blacklisted or anything.

We've tried everything we can to identify a problem. And that's where you guys come in. Any ideas? Anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | MRSWebSolutions
-
Celebrity Profile On The Side of Google For High Profile Person
Hello! When I Google "Justin Timberlake" I see web search results and a sidebar. See image below: http://screencast.com/t/qwYeiFZQRzT How does one get their results to display like this? Is this something that Google creates automatically, or is it something the celebrity initiates/creates on their behalf? Does the celebrity have any options to choose from as to what displays in this sidebar? What is this called? I look forward to your response.
Algorithm Updates | InternetRep
-
A web audit for web traffic? Need answers please..
Hi, We are a PR agency based in Dubai and we produce a lot of web content. The website is built on Ruby on Rails and we have implemented keywords and SEO strategies, but sadly the traffic pattern has not changed for the past three years. What surprised us today is that we created a page 2-3 days ago for a client who is participating in Arab Health (a very prestigious healthcare event) and suddenly our page comes up in the top 3 on google.ae as well as google.com. We are kind of convinced that there is something wrong with our code. Do you think this could be a possibility, and could the lack of change in the traffic pattern be a code issue rather than an SEO issue? What could be the possible reasons for this pattern? In such a scenario, what would experts like you recommend we do: an SEO audit, a web audit, a code audit, or hiring an SEO/web/code consultant? Thanks - helpful answers are really appreciated, and just btw, if anyone feels they could professionally help us out of this mess, we are willing to work with them. Thanks in advance
Algorithm Updates | LaythDajani
-
Will we no longer need Location + Keyword? Do we even need it at all?
Prepare yourselves. This is a long question. With the rise of schema and Google Local+, do you think Google will now have enough data about where a business is located, so that when someone searches for a keyword such as "Atlanta Hyundai dealers", a business in Atlanta whose website:

- has been properly marked up with schema (or microdata for business location)
- has claimed its Google Local+ page
- has done enough downstream work in local search listings for its NAP (name, address, phone number)

will no longer have to incorporate variations of "Atlanta Hyundai dealers" in the text on the website? Could they just write enough great content about how they're a Hyundai dealership, without the abuse of the Atlanta portion? Or if they're in Boston and they're a dentist or lawyer, could the content be just about the services they provide, without so much emphasis tied to location?

I'm talking about removing the location of the business from the text in all places other than the schema markup or the contact page on the website. Maybe still keep a main location in the title tags or meta description if it would benefit the customer. I work in an industry where location + keyword has reached such a point of saturation that it makes the text on the website read very poorly, and I'd like to learn more about alternate methods to keep the text more pure, read better, and still achieve the same success when it comes to local search.

Also, I haven't seen other sites penalized for all the location stuffing on their websites, which is bizarre, because it reads so spammy you can't recognize where the geotargeted keywords end and where the regular text begins. I've been working gradually in this general direction (more emphasis on NAP, researching schema, and vastly improving the content on clients' websites so it's not so heavy with geo-targeted keywords).

I also ask because, though the niche I work in is still pretty hell-bent on using geo-targeted keywords, whenever I check Analytics the majority of traffic is branded, and geo-targeted keywords make up only a small fraction of traffic. Any thoughts? What are other people doing in this regard?
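As a sketch of the business-location markup this question refers to, here is a minimal JSON-LD block using the schema.org AutoDealer type (a LocalBusiness subtype). All names, addresses, and numbers below are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "AutoDealer",
  "name": "Example Hyundai of Atlanta",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Atlanta",
    "addressRegion": "GA",
    "postalCode": "30301"
  },
  "telephone": "+1-404-555-0100"
}
</script>
```

With the location carried in structured data like this, the visible copy is free to focus on services rather than repeating "Atlanta" in every sentence.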
Algorithm Updates | EEE3
-
Since authorship markup requires a domain email, how can a community website allow users to link their Google+ profile?
It seems that Google now requires authors to have a valid email on the domain. This is easy for the traditional web publication. But what about community websites like SEOmoz? How can a community website allow users to link their Google+ profile? Will community websites like SEOmoz be required to:

1. Give all users a domain email
2. Ask users to validate the email address with Google?

Seems overly complicated.
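For reference, the page-level alternative to email verification was a rel="author" link from the content to the writer's Google+ profile, paired with a reciprocal "Contributor to" link on that profile. A sketch, with a placeholder profile URL:

```html
<!-- In the post byline: link the author's name to their Google+
     profile with rel="author" to associate the page with them. -->
<a href="https://plus.google.com/112345678901234567890" rel="author">Jane Author</a>
```

A community site could render this link on each post using the profile URL the user supplies, without ever touching domain email.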
Algorithm Updates | designquotes
-
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello fellow internet-goers! Just as a disclaimer: I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to find anything conclusive, so I am reaching out to the community for help. Before I get into the questions, I would like to provide some background.

I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites in total; about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us.

Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.

Now for some questions:

1. Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
2. Is it a coincidence that it was an exact 30-day "filter"?
3. Why has only one site recovered?
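A sketch of the rel="canonical" fix described above, as it would appear in the head of a duplicate product page (all URLs here are hypothetical):

```html
<!-- On the duplicate copy, e.g. https://site-b.example/product/widget:
     point search engines at the preferred version of the page. -->
<link rel="canonical" href="https://site-a.example/product/widget">
```

Each duplicate page across the shared-codebase sites would carry a canonical tag pointing at the single version chosen as authoritative.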
Algorithm Updates | WEB-IRS
-
I need help with drastic SERP difference between Bing and Google
One of our sites, which has been around for a couple of years, has about 60,000 pages showing on Google; however, Bing only shows 90 pages for the site. This same phenomenon has been happening across the board for our sites. Any ideas to improve our indexing results for Bing?
Algorithm Updates | atuomala
-
Do we need to worry about where our domain is hosted anymore? Does it make a difference anymore?
I went to a really interesting conference last week, and one of the speakers, who has been working in the SEO industry for 15 years now, said that it doesn't make a difference ranking-wise anymore. I would like to see what the community thinks on this subject. Thanks, Ari
Algorithm Updates | dublinbet