Site Audit: Indexed Pages Issue
-
Over the last couple of months I've been working through some issues with a client. One of my starting points was a site audit, following a post written by Geoff Kenyon: https://mza.seotoolninja.com/blog/technical-site-audit-for-2015
One of the main issues from the site audit seems to be that when I run a "site:domain.com" query in Google, my homepage isn't the first page listed; in fact, it doesn't appear anywhere when I go through all of the listings. I understand that your homepage isn't required to be listed first when running this type of query, but I would prefer it.
Here are some things I've done:
- I ran another query, "info:homepage.com", and the home page is indexed by Google.
- When I run a branded search for the company name the home page does come up first.
- The current page that is showing up first in the "site:domain.com" listing is my blog index page.
- Several months back I redirected the index.php page to the root of the domain. I'm not sure if this is helping or hurting.
- In the sitemap I removed the index.php and left only the root domain as the page to index.
- All interior links point to the root; index.php has been eliminated from every internal link.
- The main site navigation does not refer to the "Home" page, but instead my logo is the link to the Home page.
- Should I noindex my blog/index.php page? This page is only a compilation of posts and has no original content; in fact, it actually throws up duplicate content warnings.
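For reference, the index.php-to-root redirect mentioned above is typically done with a rewrite rule along these lines. This is a minimal .htaccess sketch assuming an Apache server; adjust for your own setup:

```apache
# Minimal sketch: 301-redirect direct requests for /index.php to the
# root of the domain, so only one URL version gets indexed.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\s]
RewriteRule ^index\.php$ / [R=301,L]
```

The RewriteCond on THE_REQUEST matters: it fires only on external requests for /index.php, not on the server's internal mapping of / to index.php, which would otherwise cause a redirect loop.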
Any help would be much appreciated. I apologize if this is a silly question, but I'm getting frustrated/annoyed with the whole situation.
-
Thanks Seoman,
That was why I was wondering if I should noindex the blog index page. It is purely a listing of blog entries and not original content. It seems to throw up duplicate content issues and Google seems to give it the most page power on the site even though it is not my most important page.
I would want Google to still follow all of the links, because those lead to the blog posts and the original content. I don't know if noindex is the best choice, but I think it would at least tell Google, "Hey guys, the blog page is not my most important page; in fact, it is just a compilation of posts."
I haven't pulled the trigger on it yet, because I don't know if it will hurt more than it helps. If anyone has any other thoughts on noindexing the blog index page (which is not my home page), feel free to drop me a line.
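For what it's worth, the "drop it from the index but keep following the links" behavior described above maps to the robots meta tag below. A sketch; it would go in the `<head>` of the blog index page only:

```html
<!-- On blog/index.php only: ask search engines not to index this page,
     but still crawl the post links it lists. -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt Disallow, this still lets crawlers fetch the page and pass through to the individual posts, which is the behavior you said you want to preserve.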
-
Apologies, I'd slightly misunderstood your question; I see exactly what you mean now. I think this is purely down to the way Google associates search intent and tries to deliver the most appropriate result.
The site: operator is intended to help users find a specific item on the specified site. If the blog has more content than the other pages, there is a better chance it contains what the user is looking for, so Google delivers that page out of preference.
I don't know for sure; it's just an assumption. As you said, branded searches are fine, and there certainly don't look to be any issues as far as I can see, although I haven't done a full audit.
I'd be interested to see what anyone else says, but my gut feeling is there is nothing to be worried about; the main thing is that you come up for your company name and the search terms that you want.
Sorry, hope that helps somewhat.
All the best
-
Feel free to take a look: www.denverilluminations.com & www.denverilluminations.com/_blog/
Also, the domain authority for the site is 19; I was looking at the individual page authorities. Thanks again, Seoman.
-
Any way you could let me have the two links so I can give them a quick look over?
Also bear in mind that DA isn't everything.
-
Seoman,
Thanks for the response. I appreciate any and all suggestions.
- The blog page has a page authority of 1 out of 100; the home page has a page authority of 33 out of 100.
- I looked at Google's cache for the pages and reviewed the text-only version, and everything is showing.
- I checked robots.txt; I'm disallowing certain directories that I don't want crawled or indexed, but those are all in order. I tested the robots.txt just to make sure it was written properly, and it came back clean.
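As a quick sanity check on the robots.txt point, you can replicate the tester locally with Python's standard-library robot parser. A minimal sketch; the rules and URLs here are made up, so substitute your own:

```python
import urllib.robotparser

# Hypothetical robots.txt rules -- replace with your real file's contents.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# The homepage and blog index should be crawlable...
print(parser.can_fetch("Googlebot", "https://www.example.com/"))
print(parser.can_fetch("Googlebot", "https://www.example.com/_blog/"))
# ...while the disallowed directory should not be.
print(parser.can_fetch("Googlebot", "https://www.example.com/private/secret.html"))
```

If the homepage ever comes back False here, that alone would explain it vanishing from a site: listing.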
I don't believe noindexing my blog page is absolutely necessary, but I'm kind of wondering if Google thinks it is my home page instead of my actual root. I know it sounds a little weird, but I'm wondering if something is confusing the spiders. Thanks again for your time and thoughts.
-
A few quick thoughts come to mind (in order of priority):
- The blog page may have more authority than the homepage.
- It could be a technical issue with the homepage (maybe Google can't see anything there).
- Check your robots.txt to make sure the homepage isn't blocked (sounds crazy, but it can happen).
I would strongly advise against noindexing unless it is absolutely necessary.
Personally, I wouldn't be too worried about the homepage not showing, although I agree it's a good idea to know why. After all, no customers are going to be using Google search operators like site: or info:. They are going to be searching for what they want and expecting an answer on the page that Google provides them with.
Not sure if that helps, but just a few thoughts.