Old documents online as link juice
-
Each month I upload my auction catalogue in several formats (Word, PDF and RTF).
I have about 9 years of catalogues online that have all been indexed by Google.
Each catalogue contains a link to my terms and conditions page, which has given that page quite high authority for some unusual but desired keywords. Each document also contains many, many mentions of non-desired keywords, plus links to my domain.
Is it worth updating all these old, previously indexed catalogues with better keyword juice and more relevant links?
Would they even get revisited by Google?
I suppose that leads to the next question: is it worth adding each of these pages to my sitemap? So far I have only added my major pages, not any of the subordinate pages.
-
That's a good set of resources indeed.
I would move forward and update part of those pages, step by step, as long as you don't go overboard: keep the links on topic, and don't over-optimize or over-link.
Is it worth updating all these old, previously indexed catalogues with better keyword juice and more relevant links?
** Relevant links, yes. Better keyword juice, no, as you would have a tendency to over-optimize. Keep it very simple, because those pages with new links could indeed bring a lot of value.
Again, it should really be done step by step and you should monitor the results; going overboard might backfire.
Would they even get revisited by Google?
** If those pages don't have any updates, the crawl rate is probably low, but they will get revisited. You can always help things along with a ping, or by building a sitemap and submitting it via your Webmaster Tools account.
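As a rough illustration of the sitemap suggestion (the catalogue URLs, dates and domain below are hypothetical, not taken from the poster's site), a minimal sitemap.xml for the old catalogue pages could be generated like this:

```python
# Minimal sketch: build a sitemap.xml listing old catalogue URLs so Google
# can rediscover them. URLs and lastmod dates here are made-up examples.
from datetime import date
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    # urls: list of (location, last-modified date) pairs
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod.isoformat()
    return tostring(urlset, encoding="unicode")

catalogues = [
    ("https://www.example.com/catalogues/2012-03.pdf", date(2012, 3, 1)),
    ("https://www.example.com/catalogues/2011-11.pdf", date(2011, 11, 1)),
]
xml = build_sitemap(catalogues)
print(xml)
```

The resulting file would then be uploaded to the site root and submitted through Webmaster Tools (historically you could also ping Google with the sitemap URL, though submitting via the tools account is the more reliable route).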
You can also check when Googlebot last visited some of those pages, if you have access to your server logs.
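Checking the logs can be as simple as filtering for the Googlebot user agent on the catalogue paths. A rough sketch, assuming a standard Apache/Nginx combined log format (the path prefix and sample lines are invented for illustration):

```python
# Sketch: scan access-log lines for Googlebot requests to the catalogue
# pages, returning (timestamp, path, status) for each hit.
import re

LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|HEAD) (\S+)[^"]*" (\d{3})')

def googlebot_hits(lines, path_prefix="/catalogues/"):
    hits = []
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LINE.match(line)
        if m and m.group(3).startswith(path_prefix):
            hits.append((m.group(2), m.group(3), m.group(4)))
    return hits

sample = [
    '66.249.66.1 - - [10/May/2012:06:12:01 +0000] "GET /catalogues/2012-03.pdf HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '192.0.2.5 - - [10/May/2012:06:13:44 +0000] "GET /terms HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample))
```

If the most recent Googlebot timestamps on those catalogue URLs are months old, that supports the step-by-step update plan above: fresher content plus a sitemap should raise the crawl rate.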