Cleaning up broken nofollow links in user-generated content.
-
We have a question/answer section on our website, so it's user-generated content. We've programmed all user-generated links to be nofollow. Over time, we've accumulated many broken links, and some are even structurally invalid, e.g. 'http:///.'. I want to go in and clean up the links to improve user experience, but how do I justify it from an SEO standpoint, and is it worth it?
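For reference, here's a quick way to flag the structurally invalid ones before worrying about links that are merely dead. This is a minimal sketch using Python's standard library and assumes the user-submitted links are available as plain strings:

```python
from urllib.parse import urlparse

def is_structurally_valid(url: str) -> bool:
    """Reject URLs that can't possibly resolve, e.g. 'http:///.' (no host)."""
    try:
        parsed = urlparse(url)
    except ValueError:
        return False
    # Require an http(s) scheme and a non-empty host component.
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

print(is_structurally_valid("http:///."))         # False: empty host
print(is_structurally_valid("https://moz.com/"))  # True
```

Anything this rejects can be stripped or unlinked outright; no point in even sending an HTTP request for it.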
-
Applying Broken Windows Theory to SEO is such an underrated tactic. It's totally worth the time. Will you be able to directly attribute revenue to the cleanup? Probably not. Will it improve the overall quality and user experience of the site? Absolutely, 100%, and that's where it becomes an SEO play - because that better quality and better UX is exactly what Google is aiming to reward in the long run. And because your site no longer looks like an easy mark for spammers, it should attract less spam over time.
Also, adding to MattAntonino's comment, Paul Haahr said a few weeks ago that the quality rater guidelines are basically Google's ideal algorithm, so you can count on Google working to incorporate as much of that as they can into the algorithm over time, as they figure out how to automate it instead of relying on human raters. So even if it's not a factor now, count on it being one in the future. Future-proofing is always a good idea.
-
I would definitely argue in favor of this. Cleaning up broken links, changing the copyright date on websites, adding new content - it all sends signals to Google that the site is maintained regularly and has active management. A site that is regularly updated is more valuable than one that is created and then left to rot.
While Matt Cutts said in 2013 (eons ago in SEO) that broken links weren't a ranking factor, the Google Search Quality Raters Handbook says they are a factor for manual review.
They actually say:
Webmasters need to make sure their websites function well for users as web browsers change. How can you tell that a website is being maintained and cared for? Poke around: Links should work, images should load, content should be added and updated over time, etc. Exercise caution relying on dates: Some webpages automatically display the current date. Rather than just looking for a recent date, search for evidence that effort is being made to keep the website up to date and running smoothly.
When the Raters Handbook says that, I fix broken links.
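If you want to do that at scale, a sketch of a checker that sorts each user-submitted link into "invalid", "broken", or "ok" might look like this. The function name and categories are my own; it assumes Python with only the standard library, and that you can iterate over your stored links:

```python
import urllib.error
import urllib.request
from urllib.parse import urlparse

def classify_link(url: str, timeout: float = 5.0) -> str:
    """Return 'invalid', 'broken', or 'ok' for a user-submitted link."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        return "invalid"  # structurally malformed, e.g. 'http:///.'
    # HEAD keeps the check cheap; note some servers reject HEAD,
    # so a production version might retry with GET on failure.
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return "ok" if resp.status < 400 else "broken"
    except (urllib.error.URLError, OSError):
        return "broken"  # 4xx/5xx, DNS failure, timeout, etc.
```

I'd queue the "broken" ones for manual review rather than deleting them automatically - a link can time out once and still be fine.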