Why does Google display the home page rather than a page which is better optimised to answer the query?
-
I have a page which (I believe) is well optimised for a specific keyword (URL, title tag, meta description, H1, etc.), yet Google chooses to display the home page instead of the page more suited to the search query.
Why is Google doing this and what can I do to stop it?
-
Thanks Lydia. I will expand on the internal links. I'm assuming:
- Links from within body copy (e.g. product descriptions) are better than any kind of nav link?
- Anchor text needs to be mixed up and not all 'exact match' text?
Thx
-
It's also possible that the lack of external backlinks is part of the issue, especially if significantly more of them point at the home page (and if the home page is at all relevant to the term you're targeting).
-
Thanks James. Yes, I'm beginning to wonder if it has been over-optimized.
There are no external backlinks, but there are a few internal links, and I noticed they use 'exact match' anchor text. The same keyword is used in the URL, title, meta description and H1. So maybe over-optimization is the issue?
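If you want to sanity-check how those internal links are anchored, here is a minimal Python sketch that tallies the anchor text of every internal link pointing at the target page. It assumes the `requests` and `beautifulsoup4` packages are installed, and all the URLs are placeholders to swap for your own pages:

```python
# Minimal sketch: tally the anchor text of internal links pointing at one target URL.
# Assumes the `requests` and `beautifulsoup4` packages; all URLs below are placeholders.
from collections import Counter
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

TARGET_URL = "https://www.example.com/widgets/"   # the page that isn't ranking
PAGES_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/some-post/",
]

anchor_counts = Counter()

for page in PAGES_TO_CHECK:
    html = requests.get(page, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.find_all("a", href=True):
        # Resolve relative hrefs against the current page before comparing.
        href = urljoin(page, link["href"])
        if href.rstrip("/") == TARGET_URL.rstrip("/"):
            anchor_counts[link.get_text(strip=True)] += 1

for anchor, count in anchor_counts.most_common():
    print(f"{count:>3}  {anchor or '[empty or image anchor]'}")
```

If nearly every count sits under the same exact-match phrase, that would support the over-optimization theory; a more natural profile mixes branded, partial-match and generic anchors.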
Related Questions
-
Google not displaying meta description
Hi, one of my clients is receiving the following error in SERP - "A description of the page is not available because of this site's robots.txt". The site is built on WordPress and I realized that by default, the settings were checked to block bots from crawling the site. So, I turned it off, fixed robots.txt and submitted the sitemap again. Since then it's been almost 10 days and the problem still exists. Can anyone tell me what should be done to fix it, or if there's a way to get Google to recrawl the pages again?
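A quick way to confirm the live robots.txt is no longer the problem is to test it against Googlebot's user agent. Here is a minimal sketch using only Python's standard-library `urllib.robotparser`; the domain and paths are placeholders for the client's actual pages:

```python
# Quick check: does the live robots.txt still block Googlebot from key URLs?
# Standard library only; the domain and paths below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example-client-site.com/robots.txt")
rp.read()

for url in [
    "https://www.example-client-site.com/",
    "https://www.example-client-site.com/some-important-page/",
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':8} {url}")
```

If the URLs come back allowed, the remaining delay is usually just recrawl time; requesting a fresh fetch of the affected pages in Webmaster Tools can help speed that up.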
Intermediate & Advanced SEO | mayanksaxena0
-
Optimum Word Count for Home Page Text
We operate a commercial real estate web site (www.nyc-officespace-leader.com) in New York City. Our home page text is about 500 words. Currently the home page text is of a promotional nature and not very engaging. We are attempting to write a checklist for companies that are seeking to lease commercial space and make the text very useful, practical and engaging. However, we are having difficulty covering all the bases with less than 1,000 words. If the home page text has 1,000-1,300 words, is that detrimental from an SEO point of view? On the plus side, I would think this would allow us to include several secondary keyword terms and to add plurals and variations of the two or three top phrases. Any thoughts or suggestions? Thanks, Alan Rosinsky
Intermediate & Advanced SEO | Kingalan10
-
What's the best way to check Google search results for all pages NOT linking to a domain?
I need to do a bit of link reclamation for some brand terms. From the little bit of searching I've done, there appear to be several thousand pages that meet the criteria, but I can already tell it's going to be impossible or extremely inefficient to save them all manually. Ideally, I need an exported list of all the pages mentioning brand terms not linking to my domain, and then I'll import them into BuzzStream for a link campaign. Anybody have any ideas about how to do that? Thanks! Jon
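One way to avoid working through those pages by hand is to script the link check once you have exported the list of mentioning URLs from whatever search or mention-monitoring tool you use. The sketch below is a rough starting point: it assumes a plain-text file of candidate URLs (`mentions.txt` is a placeholder name) plus the `requests` and `beautifulsoup4` packages, and writes the pages that don't yet link to your domain into a CSV you could import into BuzzStream:

```python
# Sketch: from a file of URLs that mention the brand, keep only those that do NOT
# already link to the target domain. File names and the domain are placeholders.
import csv

import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example-brand.com"

with open("mentions.txt") as f:
    candidate_urls = [line.strip() for line in f if line.strip()]

not_linking = []
for url in candidate_urls:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue  # skip pages that error out or time out
    soup = BeautifulSoup(html, "html.parser")
    links_to_me = any(MY_DOMAIN in a["href"] for a in soup.find_all("a", href=True))
    if not links_to_me:
        not_linking.append(url)

with open("reclamation_targets.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url"])
    writer.writerows([u] for u in not_linking)

print(f"{len(not_linking)} of {len(candidate_urls)} mentioning pages don't link yet.")
```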
Intermediate & Advanced SEO | JonMorrow0
-
Blog home page and SEO
Why do most blog owners not put content that is unique to the home page above the fold before posts begin?
Intermediate & Advanced SEO | BobAnderson0
-
Google Analytics: how to filter out pages with low bounce rate?
Hello here, I am trying to find out how I can filter out pages in Google Analytics according to their bounce rate. The way I am doing it now is the following:
1. I work inside the Content > Site Content > Landing Pages report.
2. Once there, I click the "advanced" link on the right of the filter field.
3. I then define to "include" "Bounce Rate" "Greater than" "0.50", which should show me which pages have a bounce rate higher than 0.50%... instead I get the following warning on the graph: "Search constraints on metrics can not be applied to this graph".
I am afraid I am using the wrong approach... any ideas are very welcome! Thank you in advance.
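If the in-report advanced filter keeps refusing to apply, one workaround is to export the Landing Pages report to CSV and filter it offline. Below is a minimal Python sketch that assumes the export contains "Landing Page" and "Bounce Rate" columns with values like "45.67%"; adjust the file name, column names and threshold to match your actual export:

```python
# Offline workaround: filter an exported GA Landing Pages CSV by bounce rate.
# Assumes "Landing Page" and "Bounce Rate" columns; names and threshold are assumptions.
import csv

THRESHOLD = 50.0  # bounce rate, in percent

with open("landing_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        rate = float(row["Bounce Rate"].strip().rstrip("%").replace(",", ""))
        if rate > THRESHOLD:
            print(f'{rate:6.2f}%  {row["Landing Page"]}')
```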
Intermediate & Advanced SEO | fablau0
-
Home page sudden drop in rank, but not others.
Over the weekend, I noticed "www.thematstore.com" drop out of the search results for "chair mats", where previously it ranked between 10-12. The other pages on the domain have kept their rankings for their intended keywords. The only thing I changed (on Nov 29th) was the title tag, from "Chair Mats - Commercial Quality Chair Mats, Custom Made" to "Chair Mats - Best Custom Chair Mats, Commercial Quality". I know there is a lot that needs to be done on the site, and the site architecture is a bit of a mess (I recommended building a new site)... but I was hoping to gradually fix these issues without gambling with a rank drop. Not sure why the home page all of a sudden dropped out - any suggestions?
Intermediate & Advanced SEO | Joes_Ideas0
-
Google consolidating link juice on duplicate content pages
I've observed some strange findings on a website I am diagnosing, and it has led me to a possible theory that seems to fly in the face of a lot of thinking. My theory is:
When Google sees several duplicate content pages on a website and decides to show just one version of the page, it at the same time aggregates the link juice pointing to all the duplicate pages, and ranks the one duplicate page it decides to show as if all the link juice pointing to the duplicate versions were pointing to that one version. E.g.:
Link X -> Duplicate Page A
Link Y -> Duplicate Page B
Google decides Duplicate Page A is the most important and applies the following formula to decide its rank:
Link X + Link Y (minus some dampening factor) -> Page A
I came up with this idea after I seem to have reverse engineered it: the website I was trying to sort out for a client had this duplicate content issue, so we decided to put unique content on Page A and Page B (not just one pair of pages like this, but many). Bizarrely, after about a week all the Page A's dropped in rankings - indicating a possibility that the old consolidated link value had been correctly re-associated with the two separate pages, so now Page A would only be getting Link Value X. Has anyone got any tests/analysis to support or refute this?
Intermediate & Advanced SEO | James77
-
Google Places optimisation for service franchise, 150 franchisees with no physical addresses?
So we have a client who is a plumbing franchise with about 150 franchisees across the country. Because it's a plumbing franchise, the businesses don't have street addresses (apart from the franchisees' home addresses, but we don't want to use those). We used to have bulk-uploaded listings for the franchise locations and used the GPO address in the suburb/city as the address, and got away with this fine for years. Google has copped onto this and is now asking for reverification of the listings by post. So my question is: what's the best way to optimise Places for 150+ locations? As a quick fix, we're going to add a new Places location as the master franchise HQ office (an address that does exist). We can then add all the suburbs/areas serviced into this location, which may or may not show up for local searches in those areas. We could potentially verify all listings by mail by using private mailboxes, but mail verification on that scale is likely to be flaky, not to mention an admin nightmare. Does anyone have any experience with this and how they got around it?
Intermediate & Advanced SEO | Brendo2