Do more page links work against Google SEO ranking when there is only one URL that other sites will link to?
-
Say I have a coupon site in a major city, and assume there are 20 main location regions (suburb cities) in that city.
Assume that all external links to my site will point only to the home page (www.site.com). Assume also that my website business has no physical location.
Which scenario is better?
1. One home page that serves up dynamic results based on the user's location cookie but mentions all 20 locations in its content. Google indexes one page only, and all external links point to it.
2. One home page that redirects the user to their region page (one of 20), so the site has 20 pages--one for each region, each optimized for that region. Google indexes 20 pages and there will be internal links to the other 19 pages, BUT all external links still point only to the main home page.
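For illustration, the routing logic of scenario 2 could be sketched like this (a minimal sketch; the cookie name, region slugs, and function names are hypothetical, not from the original post):

```python
# Scenario 2 sketch: the home page redirects each visitor to one of the
# 20 regional pages based on a location cookie, falling back to a
# default region when the cookie is missing or unrecognized.

REGIONS = {"north-suburb", "south-suburb", "downtown"}  # 20 slugs in practice
DEFAULT_REGION = "downtown"

def region_redirect_target(cookies: dict) -> str:
    """Return the path of the regional page the home page should
    302-redirect to, based on the visitor's location cookie."""
    region = cookies.get("region", DEFAULT_REGION)
    if region not in REGIONS:
        region = DEFAULT_REGION
    return f"/{region}/"
```

A temporary (302) redirect would typically be used here, since the redirect target varies per visitor; a permanent (301) redirect would tell Google one regional page is the canonical home page.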
Thanks.
-
Thanks Marc. Sorry for the slow response--I came down with a bug last night.
Here is the basis for my thinking that link juice is about PageRank and not so much about the resulting search rank, and that PageRank may actually no longer be a big factor in overall search rank--and so my concern about dilution is overblown. Regardless of the truth of the matter, I appreciate your advice about content and relevancy:
http://blog.hubspot.com/blog/tabid/6307/bid/5535/Why-Google-Page-Rank-is-Now-Irrelevant.aspx
Thanks again, Ted
-
Take a look at the screenshot - it's taken from this URL:
http://moz.com/search-ranking-factors
So your only misunderstanding is that link juice has nothing to do with search rank... it is a ranking factor, so you should think about how to use it more effectively. On the other hand, websites with only a few pages, or with thin content, will have their own ranking problems. BUT everybody (including Google, I guess) would prefer a smaller site that provides good content over a big site full of nonsense.
If you can ensure a certain quality and uniqueness for every single (sub)page of your site, then go ahead and use this scenario... if you can only create (partial) duplicate content: hands off!
-
Hi Marc,
Yeah, I may not be explaining my understanding correctly, or I may not understand correctly. What I have read is that the issue of link juice is only connected to PageRank, not search rank. So if I have no backlinks to my subpages, then I don't lose any home-page juice. So why even have subpages if they get no backlinks? Because of search rank: queries can still lead people to my subpages. In fact, I've read that PageRank is hardly even a factor in search rank anymore, which implies that no one should be concerned about link juice dilution at all! I'd like to believe it, because I will potentially have plenty of pages with unique content and would like to build backlinks to at least some of them besides the home page.
Does it sound like I've misunderstood this issue?
-
Maybe I didn't understand you correctly, but to avoid mistakes, please have a look at the attached graphic (link juice)... it works as I've explained... I mean, it's not really bad to add several subpages and pass some of your overall link juice towards them, but there is no real advantage in the first place... let's say you want to do a really, really good job: then you have to create absolutely unique subpages (20 times in your case) for more or less the same topic... terrific if you can do so... then use the subpage model... It's not an indisputable fact that your site won't rank if it's just one page... chances might rise if you have additional subpages, but only if you can fill each page with unique content. I think there is a potential risk that you just create duplicate content (or partial duplicate content) and pass some of your link juice towards those imperfect subpages... so if you think you can create 20 unique subpages, then choose this scenario... if it's more or less a copy of the main site, then this wouldn't make any sense.
-
Hi Marc,
Thank you... I've heard this, but here is why I find this issue so perplexing: First, I have read that link juice is ONLY associated with inbound links. So if, in both scenarios above, all inbound links go to the home page only, then there is no decrease in link juice if I have 20 internal pages, YET I get the benefit of having 20 more pages indexed that might show up in a user query. I guess I'm trying to confirm that my understanding is correct before I have the programmer (me) set up 20 internal pages... I don't want to lose any more link juice from the home page than I have to.
Yesterday the SEO guy I'm thinking of hiring wrote this:
"If you only have the home page indexed, you will never rank. If you only have incoming links to the home page, you will never rank." I don't really understand this... it is in the context of a coupon site that offers coupons for all regions in all cities, and of course they will be categorized into some 30 categories and 200 subcategories...
Any further input? I really do appreciate it...
-
Link juice (or link power) is a term that comes up in the scenarios you describe.
You have to imagine that every single external link gives your site this link power/link juice. Accordingly, keeping it within one page would be the better decision. If your main page has several additional local pages behind/under it, the link juice will be passed on to those pages.
You don't have to be a genius in mathematics to see that this would split the link juice across 20 pages (in the scenario you describe)...
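That arithmetic can be sketched as a toy model (an illustration of the classic PageRank intuition only, not Google's actual ranking formula; the 0.85 damping factor is the value from the original PageRank paper):

```python
# Toy model of link-juice dilution: a page with score `pr` passes
# roughly pr * damping, split evenly across its outbound links.
# This is a simplification for illustration; real-world ranking is
# far more complex.

def juice_per_link(pr: float, outlinks: int, damping: float = 0.85) -> float:
    """Score each linked page receives from a page with score `pr`
    that has `outlinks` outbound links."""
    return pr * damping / outlinks

# A home page (score 1.0) linking to 20 regional subpages passes
# each of them 1.0 * 0.85 / 20 = 0.0425 -- a twentieth of the
# passable score, which is the dilution described above.
per_subpage = juice_per_link(1.0, 20)
```

In this model the home page's own score is unchanged by adding internal links; what gets divided is the amount passed onward to each subpage.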