Large volume of Ning files in subdomain - hurting or helping?
-
I have a client that has 600 pages in their root domain and a subdomain that contains 7,500 un-SEO-able Ning pages. Plus another 650 pages from Sched.com that are also contributing a large volume of errors.
My question is: should I create a new domain for the Ning content, or am I better off keeping the volume of pages even if they have loads of errors?
Thanks!
-
Heya,
I don't know what 'Sched.com' is as there's nothing on that domain, or what you mean by a 'Ning' file, but applying basic rules:
- Do what you can to reduce errors on the site - this may involve restructuring the site or moving files around
- You don't need new domains for storing content, sub-domains or sub-folders will suffice
- Having content/files which are not 'SEO-able' is not an issue. If you focus on the user's experience of the website, reduce clutter and errors, and ensure the site is easily crawlable, then you are getting things off on the right footing.
- 600 pages in a root domain is crazy, but if they are named helpfully then it doesn't necessarily have to be a problem. I often have sites where an index.php governs the site and all the content is stored in a sub-folder. It's not necessarily where the files are stored, but how they are managed and organised, that makes a difference to the webmaster, website visitors and, indeed, search engines.
- You should be able to fix errors without moving pages off-site, else why have them anywhere?
Hope this helps in some way
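If restructuring does mean moving the subdomain content into a sub-folder on the main domain, 301 redirects preserve whatever equity those pages have. A minimal sketch for Apache, assuming hypothetical hostnames and paths (ning.example.com moving to example.com/community/) - adapt to your actual setup:

```apache
# .htaccess on the subdomain's document root (hostnames are placeholders)
RewriteEngine On
# Match requests arriving on the old subdomain...
RewriteCond %{HTTP_HOST} ^ning\.example\.com$ [NC]
# ...and permanently redirect them to the matching sub-folder path
RewriteRule ^(.*)$ https://example.com/community/$1 [R=301,L]
```

Verify a few old URLs return a 301 (not 302) with curl before and after the switch, so crawlers consolidate signals to the new locations.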