TLDs vs ccTLDs?
-
I was trying to get this question answered in another thread, but someone marked it as "answered" and no more responses came.
So the question is about best practices on TLDs vs ccTLDs. I have a .com TLD with DA 39 that redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, there is some overlap, as the same content shows up on both ccTLDs.
What are best practices here? It doesn't look like my ccTLDs are getting any juice from the TLD. Should I just take my ccTLDs and combine them into my TLD as subdomains? Will I see any benefits?
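For concreteness, consolidating the ccTLDs into the .com would amount to a URL-for-URL 301 redirect map along these lines. This is just a sketch with placeholder domains (not the poster's real sites), showing how each ccTLD host could map onto a subfolder of the .com:

```python
# Hypothetical sketch: map ccTLD URLs onto subfolder equivalents on the
# consolidated .com, as the basis for a 301 redirect map.
# All domain names here are placeholders, not the poster's real sites.
from urllib.parse import urlsplit, urlunsplit

# ccTLD host -> subfolder on the consolidated .com
FOLDER_FOR_HOST = {
    "www.example.co.id": "/id",
    "www.example.com.sg": "/sg",
}

def consolidated_url(url: str) -> str:
    """Return the subfolder equivalent of a ccTLD URL, or the URL unchanged."""
    parts = urlsplit(url)
    folder = FOLDER_FOR_HOST.get(parts.netloc)
    if folder is None:
        return url  # not one of the ccTLDs; leave untouched
    return urlunsplit(("https", "www.example.com", folder + parts.path,
                       parts.query, parts.fragment))

print(consolidated_url("https://www.example.co.id/products/widget"))
# -> https://www.example.com/id/products/widget
```

Each old URL should 301 to its exact equivalent, not to the new homepage, so the accumulated links keep pointing somewhere relevant.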
Thanks
V
-
Thanks, Jane, that's a much better answer/example!
-
Hi again,
Sorry it has taken a few days to get back to you. I replied in the other thread about ccTLDs versus using one site. Some additional info: in general, you will have an easier time using the subfolder structure recommended in the other question (again, as long as there are no factors that make it important to have country-specific domains). The Singapore and Indonesian sections of the website will naturally inherit authority because they sit on the strong .com; merely being linked to from the .com, as your ccTLDs are now, isn't enough to give them a boost of that size.
Apple uses this strategy for internationalisation: http://www.apple.com/uk/ for the UK, http://www.apple.com/nz/ for New Zealand, http://www.apple.com/sg/ for Singapore and so forth.
On the other hand, BlackBerry uses subdomains: http://uk.blackberry.com/ and http://sg.blackberry.com/.
Amazon obviously uses ccTLDs.
All of these domains are hellishly strong in their own right; traditionally, the thinking has been that it's best to use one site, as Apple does, if you are not a mammoth already. However, you can make the other options work with good link development. I think in your case, one domain is something to seriously consider.
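Whichever structure wins out, it's worth annotating the localized sections with hreflang so Google knows which version targets which country. A hedged sketch for the subfolder setup (placeholder domain and URLs; I'm assuming English content for Singapore and Indonesian for Indonesia, so adjust the language-region codes to your actual content):

```html
<!-- Illustrative only: placeholder domain, subfolder structure assumed -->
<link rel="alternate" hreflang="en-sg" href="https://www.example.com/sg/" />
<link rel="alternate" hreflang="id-id" href="https://www.example.com/id/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

These tags go in the head of every page in the set, each page listing all alternates including itself.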
Related Questions
-
Google Indexing Of Pages As HTTPS vs HTTP
We recently updated our site to be mobile optimized. As part of the update, we had also planned on adding SSL security to the site. However, we use an iframe from a third-party vendor on a lot of our site pages for real estate listings, and that iframe was not SSL friendly; the vendor does not have a solution for that yet, so those iframes weren't displaying their content. As a result, we had to shift gears and go back to plain http rather than the https we were hoping for. However, Google seems to have indexed a lot of our pages as https, which gives a security error to any visitors. The new site was launched about a week ago, and there was code in the .htaccess file that was pushing to www and https. I have fixed the .htaccess file to no longer force https. My question is: will Google reindex the site once it recognizes the new .htaccess rules in the next couple of weeks?
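For anyone in a similar spot, the rollback rule would look something like this sketch (mod_rewrite assumed; the domain is a placeholder, and note that the indexed https URLs still need a working certificate in place for browsers to even receive the redirect):

```apache
RewriteEngine On
# Send any request that arrived over SSL back to the http version (301)
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With a permanent redirect in place, Google will generally consolidate back onto the http URLs as it recrawls, though the timeline depends on crawl frequency.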
Intermediate & Advanced SEO | vikasnwu1
How to Evaluate Original Domain Authority vs. Recent 'HTTPS' Duplicate for Potential Domain Migration?
Hello Everyone, So our site has used ‘http’ for the domain since the start. Everything has been set up for this structure, and Google is only indexing these pages. Just recently, a second version was created on ‘httpS’. We know having both up is the worst-case scenario, but now that both are up, is it worth just switching over, or would the original domain authority warrant keeping it on ‘http’ and redirecting the ‘httpS’ version? (Assuming speed and other elements wouldn't be an issue and it's done correctly.) Our thought was that if we could do this quickly, it would be easier to just redirect the ‘httpS’ version, but we were not sure if the pros of ‘httpS’ would be worth the resources. Any help or insight would be appreciated. Please let us know if there are any further details we could provide that might help. Looking forward to hearing from all of you! Thank you in advance for the help. Best,
Intermediate & Advanced SEO | Ben-R1
YOAST Premium extensions vs Yoast SEO Free?
Considering buying the Yoast Premium extensions (or perhaps the whole bundle), but trying to weigh up whether it's worth it or whether the free version is comprehensive enough to do the job. Obviously the paid version has more features, but is it worth the price tag? (I have 4 different websites.) If anyone has the paid Yoast Premium pack and has seen improvements in rankings, markups in Google, etc., I would appreciate hearing your story! Thank you in advance
Intermediate & Advanced SEO | IsaCleanse0
Desktop vs. Mobile Website - ranking impact
We are working on developing mobile pages using the dynamic serving method, and we plan to make only a number of important pages (not the whole site) mobile friendly. To keep the user experience consistent, the new mobile site will only have internal links to pages that are mobile friendly. Question: if an existing non-mobile page ranks #1 in the mobile SERPs today, but will not have a mobile-friendly version and will not be linked from the mobile-friendly site, will there be any impact on its ranking? Assumptions: Google's mobile/smartphone bots will not see a link to this page, but the page will still be accessible to Google's desktop bots.
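One thing worth double-checking with dynamic serving: Google's guidance is to send a Vary: User-Agent header on dynamically served URLs so crawlers and caches know the HTML differs by device. A minimal Apache sketch (mod_headers assumed; scope it to the dynamically served URLs rather than site-wide if only some pages vary):

```apache
# Tell caches and crawlers that this URL's HTML varies by user agent
<IfModule mod_headers.c>
    Header append Vary User-Agent
</IfModule>
```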
Intermediate & Advanced SEO | tomchu0
Microsites: Subdomain vs own domains
I am working on a travel site about a specific region, which includes information about lots of different topics, such as weddings, surfing, etc. I was wondering whether it's a good idea to register domains for each topic, since it would enable me to build backlinks. I would basically keep the design more or less the same and implement a nofollow navigation bar on each microsite, e.g.:
weddingsbarcelona.com
surfingbarcelona.com
Or should I rather go with one domain and subfolders:
barcelona.com/weddings
barcelona.com/surfing
I guess the second option is how I would usually do it, but I just wanted to see what the pros/cons of both options are. Many thanks!
Intermediate & Advanced SEO | kinimod
New mydomain.com/blog option vs. blog.mydomain.com option
Our e-commerce site has been on BigCommerce for about a year now. One thing many SEO folks had told us is that having a blog located at /blog would help more than a subdomain blog option. BC has never had the option to host a blog on their platform (/blog) until now. Since we have lost traffic in the past and are trying everything we can to regain it, I am now wondering if we should purchase the WordPress Site Redirect upgrade and move the subdomain blog (blog.) to the new /blog option. Any help or feedback from you is very much appreciated. I have attached a screenshot of our main website vs. our blog from Open Site Explorer in case it helps.
Intermediate & Advanced SEO | josh3300
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.
We do not want #2, the Vehicle Details pages, indexed, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)
Noindex advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement: vehicle details pages are served using Ajax, so they have no <head> tag to hold a meta robots tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending noindex based on querystring variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it)
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindex pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if blocked by robots.txt
Hash (#) URL advantages:
- By using hash (#) URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since they can't crawl/follow them?
Initially, we implemented robots.txt (the "sledgehammer solution"). We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
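For what it's worth, the X-Robots-Tag approach described above can be sketched roughly like this in Apache. This is hypothetical: the querystring parameter name is a placeholder, mod_rewrite and mod_headers are assumed, and the pages must not also be disallowed in robots.txt or Googlebot will never see the header:

```apache
RewriteEngine On
# Flag vehicle-details requests (identified here by a placeholder parameter)
RewriteCond %{QUERY_STRING} (^|&)vehicle_id= [NC]
RewriteRule .* - [E=NOINDEX:1]
# Send noindex only on flagged responses
Header set X-Robots-Tag "noindex, nofollow" env=NOINDEX
```

Because the directive is conditional on the environment variable, the Vehicle Listings pages keep serving normal, indexable responses.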
Controlling PageRank vs flat site architecture
Hey all. Here's the scenario. I have this pretty trusted site with a relatively high PR. The navigation menu has around 300 links, because it is a CSS menu that drills down into subcategories. Now, would restricting the number of links in this menu be beneficial? I am not worried about subcategory pages not being crawled or indexed, but I am concerned that subcategory pages will not receive as high a PageRank if they are not linked to directly from the home page, thereby lowering their ranking potential. Even new pages receive a PR of 5 if linked to from the home page. But I'm also thinking that toning down the menu size would be beneficial by funneling more PageRank to category pages and increasing the likelihood of ranking for some core head/middle terms. I have seen sites that externalize the menu in JavaScript files and disallow it in robots.txt to prevent too much PageRank from linking out, but SEO isn't really one-solution-fits-all in my experience. I may try a test. Externalizing the menu may also increase the relevance of pages, because I won't have a bunch of other content on the page that isn't relevant to that page's specific keywords. Anyone with experience in this arena? I would love to hear your input. Thanks
Intermediate & Advanced SEO | JeremyNelson580