Silo Architecture Across Multiple Domains
-
I am running a test and would like your opinion on it from an SEO standpoint:
I would like to structure a website whose menu links to three keyword-rich domain names, each organized as a silo around its own topic, with no content duplicated between them.
Something like a cross-domain siloing approach.
Do you think it would work? How should I build this in order to ultimately build rankings for the main site? Could this approach work in a global way, with each second-level domain ranking on its own while also propelling the main domain?
Any articles, advice, graphics, documentation, or comments are welcome!
-
I had a competitor who used to build sites like this. The top navigation linked out to other keyword-rich domains, all with identical designs, so a visitor who wasn't paying close attention to the URL would think they were still on the same site.
This guy ran several of these little site clusters.
They all disappeared from Google.
Related Questions
-
Silo structure in the eyes of Google?
Does a silo structure have a positive effect on Google rankings or not? And how important is internal linking: how does Google treat content with rich internal linking compared to content with fewer internal links? I'm running an experiment where I do a lot of internal linking on Website Unionwell compared to Website B (which apparently has fewer internal links). In your experience in the SEO field, which site will gain traffic more quickly?
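For an experiment like this, it helps to actually measure the variable being tested. A small standard-library script can count internal vs. external links in a page's HTML; run it against pages of both sites to quantify the difference. This is a minimal sketch: the host name and markup below are placeholders, not the real sites.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts internal vs. external <a href> links in an HTML document."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative URLs and same-host URLs count as internal links.
        if not host or host == self.site_host:
            self.internal += 1
        else:
            self.external += 1

html = ('<a href="/about">About</a>'
        '<a href="https://example.com/services">Services</a>'
        '<a href="https://other.com">Partner</a>')
counter = LinkCounter("example.com")
counter.feed(html)
print(counter.internal, counter.external)  # 2 1
```

Comparing the averages across a sample of pages from each site gives a concrete number to pair with the traffic results.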
Intermediate & Advanced SEO | saimkhanna0
-
SEO effect of content duplication across a hub of sites
Hello, I have a question about a website I have been asked to work on. It is for a real estate company which is part of a larger company. Along with several other (rival) companies, it has a website of property listings which receives a feed of properties from a central hub site, so there is a lot of potential for page, title and meta content duplication (if it isn't already occurring) across the whole network of sites. In early investigation I don't see any of these sites ranking well at all in Google for expected search phrases. Before I start working on things that might improve their rankings, I wanted to ask some questions:
1. How would such duplication (if it is occurring) affect the SEO rankings of such sites individually, or the whole network/hub collectively?
2. Is it possible to tell if such a site has been "burnt" for SEO purposes, especially from any duplication?
3. If such a site or the network has been totally burnt, are there any approaches or remedies that can improve the site's SEO rankings significantly, or is the only/best option to start again from scratch with a brand new site, ensuring the use of new meta descriptions and unique content?
Thanks in advance, Graham
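For question 1, a quick first check of whether duplication is actually occurring is to crawl each site in the network, collect the page titles (and likewise meta descriptions), and count exact repeats. A minimal sketch, using hypothetical titles in place of a real crawl export:

```python
from collections import Counter

# Hypothetical <title> values collected from a crawl of the network's sites.
titles = [
    "3 Bed Villa for Sale | Example Estates",
    "3 Bed Villa for Sale | Example Estates",
    "2 Bed Apartment for Sale | Example Estates",
    "3 Bed Villa for Sale | Rival Homes",
]

# Any title seen more than once is a duplication candidate worth inspecting.
duplicates = {title: n for title, n in Counter(titles).items() if n > 1}
print(duplicates)
```

The same tally run across all the rival sites fed by the hub would show whether the duplication is within one site, across the network, or both.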
Intermediate & Advanced SEO | gmwhite9991
-
Is automated quality content acceptable even though it looks similar across pages?
I have some advanced statistics modules implemented on my website, which provide very high added value for users. However, the wording is similar across 1,000+ pages, with the difference being the statistical findings.
Intermediate & Advanced SEO | khi5
Page Ex 1: http://www.honoluluhi5.com/oahu/honolulu-condos/
Page Ex 2: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you can see, the same wording is used ("Median Sales Price per Year", "$ Volume of Active Listings", etc.), while the findings/results are obviously different. Questions: Are search engines smart enough to recognize the quality in this, or do they see similar wording across 1,000+ pages and potentially consider the pages low-quality content, because they are unable to identify the high added value and the complexity of pulling such quality data? If that may be the case, should I make the pages more "unique" by including a little piece of writing about each page, even though it is of no value to users?
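One rough way to see the pages the way a duplicate-detection heuristic might is to measure word-shingle overlap between two pages' visible text. A sketch, with abbreviated stand-ins for the real page copy; a Jaccard score near 1.0 means the pages share mostly boilerplate wording, near 0.0 means they are largely unique:

```python
def shingles(text, k=3):
    """Return the set of k-word shingles of a text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets, in [0.0, 1.0]."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page1 = "Median Sales Price per Year for Honolulu condos and volume of active listings"
page2 = "Median Sales Price per Year for Waikiki condos and volume of active listings"
print(round(jaccard(page1, page2), 2))  # 0.57
```

Running this on the full rendered text of a sample of the 1,000+ pages would show how much of each page is template versus page-specific statistics, which is essentially the question being asked of the search engines.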
Can SPA (single-page architecture) websites be SEO friendly?
What is the latest consensus on SPA web design architecture and SEO friendliness?
Intermediate & Advanced SEO | Robo342
By SPA, I mean that rather than each page having its own unique URL, each page would have an anchor added to a single URL. For example:
Before SPA: website.com/home/green.html
After SPA: website.com/home.html#green (rendering a new page using AJAX)
It would seem that Google may have trouble differentiating pages with unique anchors vs. unique URLs, but have they adapted to this style of architecture yet? Are there any best practices around this? Some developers are moving to SPA as the state of the art in architecture (e.g., see this thread: http://www.linkedin.com/groups/Google-crawling-websites-built-using-121615.S.219120193), and yet there may be a conflict between SPA and SEO. Any thoughts or black and white answers? Thanks.
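For what it's worth, the safer pattern for an SPA is to give each view a real path via the HTML5 History API (pushState) rather than a fragment, since everything after # is traditionally not treated as a distinct URL by crawlers. If migrating away from fragments, you would maintain a one-to-one mapping from old hash URLs to new path URLs (plus redirects). A hypothetical mapping function, matching the example URLs above:

```python
from urllib.parse import urlparse

def hash_to_path(url):
    """Rewrite a fragment-based SPA URL to a crawlable path-based URL,
    e.g. /home.html#green -> /home/green.html."""
    parsed = urlparse(url)
    if not parsed.fragment:
        return url  # nothing to rewrite
    base = parsed.path
    if base.endswith(".html"):
        base = base[:-len(".html")]
    return f"{parsed.scheme}://{parsed.netloc}{base}/{parsed.fragment}.html"

print(hash_to_path("https://website.com/home.html#green"))
# https://website.com/home/green.html
```

The same mapping, generated for every view, doubles as the redirect table and the sitemap input.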
Where is the point of diminishing returns for silos and keyword subdirectories?
And as a follow-up, is that point far enough out to justify making a site's folder structure different from its navigation structure? I'll give an example. Say I was to do SEO for a hypothetical (I hope) someconstructioncompany.com, and the menus/submenus were laid out as:
About Us
---- Our company
---- Our staff
---- Locations
-------- Albany (default path would be .com/about-us/locations/albany-ny.html)
-------- Miami
-------- Liverpool
Services
---- Kitchen remodeling (default path would be .com/services/kitchen-remodeling.html)
---- Above ground pools
---- Green building
Photo galleries
---- Kitchen photos (default path would be .com/photo-gallery/kitchen-photos.html)
---- Pool photos
---- Green building photos
Would there be any benefit (and if so, enough of a benefit to outweigh the additional overhead of keeping track of a separate structure) to having the menus set that way, but the actual files siloed as:
someconstructioncompany.com/kitchen-remodeling/kitchen-renovation-services.html
someconstructioncompany.com/kitchen-remodeling/custom-kitchen-photo-gallery.html
someconstructioncompany.com/above-ground-pools/above-ground-pool-photos.html
someconstructioncompany.com/albany-ny/green-building-custom-home-remodeling-contractor-albany.html
Would that separation of navigation structure and file structure be beneficial, or would the time/effort setting it up be better spent elsewhere? Thanks!
Intermediate & Advanced SEO | BrianAlpert780
-
There's a website I'm working with that has a .php extension. All the pages do. What's the best practice to remove the .php extension across all pages?
Client wishes to drop the .php extension on all their pages (they've got around 2k pages). I assured them that wasn't necessary. However, in the event that I do end up doing this, what's the best-practice (and easiest) way to do it? This is also a WordPress site. Thanks.
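Dropping the extension isn't necessary for SEO in itself (URL consistency matters more than cosmetics), but if the client insists and the .php pages are real files on disk (rather than WordPress permalinks, which are configured under Settings > Permalinks and normally carry no extension anyway), one commonly used Apache .htaccess sketch is below. Treat it as an assumption-laden example rather than a drop-in fix: test it on staging, and keep the 301s so the old URLs pass their equity to the new ones.

```apache
RewriteEngine On

# 301-redirect requests for /page.php to the extensionless /page,
# so old indexed URLs consolidate onto the new form.
RewriteCond %{THE_REQUEST} ^[A-Z]+\s([^\s]+)\.php[\s?] [NC]
RewriteRule ^ %1 [R=301,L]

# Internally serve /page from page.php when that file exists
# (no redirect; the visitor only ever sees the clean URL).
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*?)/?$ $1.php [L]
```

After deploying, internal links, canonicals, and the XML sitemap should all be updated to the extensionless form so the redirects become a safety net rather than the normal path.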
Intermediate & Advanced SEO | digisavvy0
-
In order to improve SEO with silo URLs, should I move my posts from the blog directory to the page directories?
Right now, my website looks like this:
myurl.com/blog/category1/mypost.html
myurl.com/category1/mypage.html
So I use silo URLs. I'd like to improve my ranking a little bit more. Would it be better to change my URLs like this:
myurl.com/category1/blog/mypost.html
or maybe:
myurl.com/category1/mypost.html
myurl.com/category1/mypage.html
Thanks
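Whichever structure is chosen, every old URL needs to 301-redirect to exactly one new URL, or the move will shed more equity than the silo gains. A tiny helper like this (hypothetical, matching the URL patterns above, for the second option of dropping the /blog/ prefix) can generate the redirect map from a list of existing post paths:

```python
def silo_url(old_path):
    """Map /blog/<category>/<post> to /<category>/<post>,
    so posts sit directly inside their category silo."""
    prefix = "/blog/"
    if old_path.startswith(prefix):
        return "/" + old_path[len(prefix):]
    return old_path  # pages outside /blog/ keep their URLs

print(silo_url("/blog/category1/mypost.html"))  # /category1/mypost.html
```

Feeding every crawled post path through this produces the old-to-new pairs for the server's redirect rules; it also surfaces collisions early, in case a post and a page would end up at the same new path.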
Intermediate & Advanced SEO | Max840
-
Link Architecture: Xenu Link Sleuth vs. Manual Observation Confusion
Hi, I have been asked to complete some SEO contracting work for an e-commerce store. The navigation looked a bit unclean, so I decided to investigate it first.
a) Manual observation: Within the catalogue view, I loaded up the page source, hit Ctrl-F and searched for "href"; it turns out there are around 750 links on this page, and most of the other sub-catalogue and product pages also have about 750 links. Ouch! My SEO knowledge tells me this is non-optimal.
b) Link Sleuth: I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported into Open Calc and ran a pivot table to count the number of pages per site level. The results looked like this:
Level | Pages
0 | 1
1 | 42
2 | 860
3 | 3268
Now this looks more like a pyramid. I think this is because Link Sleuth can only read one layer of the nav bar at a time; it doesn't "hover" and read the rest of the nav bar (unlike what can be found by searching for "href" in the page source). Question: How are search spiders going to read the site? Like in (a) or in (b)? Thank you!
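A count like the pivot table above can also be reproduced straight from a crawler's URL export by treating the number of path segments as the level. A sketch with placeholder URLs; note the caveat that a crawler such as Xenu assigns level by click depth from the home page, not by URL path depth, which is one reason the two views of the site can differ:

```python
from collections import Counter
from urllib.parse import urlparse

def level(url):
    """Site level approximated as path-segment count (homepage = 0)."""
    path = urlparse(url).path.strip("/")
    return len(path.split("/")) if path else 0

# Hypothetical URL export from a crawl (stand-ins for the real store).
urls = [
    "https://shop.example/",
    "https://shop.example/catalogue/",
    "https://shop.example/catalogue/widgets/",
    "https://shop.example/catalogue/widgets/blue-widget.html",
]

print(Counter(level(u) for u in urls))
```

Comparing this path-depth distribution against the crawler's click-depth distribution shows directly how much the 750-link navigation flattens the site: pages deep in the URL structure can still sit one click from home.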
Intermediate & Advanced SEO | DigitalLeaf0