Can anyone help me diagnose an indexing/sitemap issue on a large e-commerce site?
-
Hey guys. Wondering if someone can help diagnose a problem for me.
Here's our site: https://www.flagandbanner.com/
We have a fairly large e-commerce site--roughly 23,000 URLs according to crawls with both Moz and Screaming Frog. I created an XML sitemap (using Screaming Frog) and uploaded it to Webmaster Tools. WMT is showing only about 2,500 of the sitemap URLs as indexed. Further, WMT is showing that Google is indexing only about half (approx. 11,000) of the URLs. Finally (to add even more confusion), a site: search on Google turns up only about 5,400 URLs. The numbers are all over the place!
Here's the robots.txt file:
User-agent: *
Allow: /
Disallow: /aspnet_client/
Disallow: /httperrors/
Disallow: /HTTPErrors/
Disallow: /temp/
Disallow: /test/
Disallow: /i_i_email_friend_request
Disallow: /i_i_narrow_your_search
Disallow: /shopping_cart
Disallow: /add_product_to_favorites
Disallow: /email_friend_request
Disallow: /searchformaction
Disallow: /search_keyword
Disallow: /page=
Disallow: /hid=
Disallow: /fab/*
Sitemap: https://www.flagandbanner.com/images/sitemap.xml
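Once the run-together directives are split onto their own lines, you can sanity-check the file with Python's built-in robots.txt parser. A minimal sketch (note: Python's parser applies rules in file order rather than Google's longest-match rule, so the redundant "Allow: /" is left out here, and wildcard patterns like /fab/* are not interpreted the way Google interprets them):

```python
from urllib import robotparser

# A trimmed-down version of the corrected robots.txt, fed in as lines.
rules = """User-agent: *
Disallow: /shopping_cart
Disallow: /search_keyword
Disallow: /page=
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Product pages should be crawlable; blocked paths should not be.
print(rp.can_fetch("*", "https://www.flagandbanner.com/products/citizenship-gifts.asp"))  # True
print(rp.can_fetch("*", "https://www.flagandbanner.com/shopping_cart"))                   # False
```

This is just a quick smoke test against the directives, not a substitute for the robots.txt tester in Webmaster Tools.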
Anyone have any thoughts as to what our problems are??
Mike
-
A site running ASP should be perfectly fine. I bet you will see substantial increases in a lot of positive metrics just by paring down that navigation.
-
Thanks so much for your response, Russ.
You're confirming one of the many issues we have identified (too many internal links), but I had not connected it to indexing or site speed. When I use the Google PageSpeed tool, many of our pages are not even registering; it seems like they take too long to load, so the tool times out. Could the huge number of links be contributing to that, too?
Moreover, our mobile speed is especially poor. Could this be an even bigger problem on mobile?
Are you familiar with .asp sites, in particular, having indexing issues...or is that a false assumption?
Mike
-
Thanks for the question!
First, it is very common to get inconsistent answers from GSC, site:, sitemap and crawl results. Don't worry too much about that.
Your goal is to get as many of your pages indexed and that is a function of links pointing to your site and internal link structure. While it is an imperfect analogy, we often refer to this as "crawl budget". There are essentially 2 solutions to this...
1. Get more/better backlinks to a diversity of pages on your site.
2. Improve your internal link architecture so that Googlebot finds your pages more quickly.
I think the problem in your case is that the site inundates bots with generic navigational links. For example, this page...
http://www.flagandbanner.com/products/chrome-air-force-lt-general-flag-kit.asp
has 1400 internal links! That is crazy!
This page has 1500!
https://www.flagandbanner.com/products/citizenship-gifts.asp
You need to reel this back in dramatically. Your navigation should link to top-level categories, or maybe a handful of subcategories. Once in a category, you can reveal deeper categories. This will increase the likelihood that the related and "also buy" links you find on product pages get found and followed by Googlebot.
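If you want to audit those internal-link counts yourself without a full crawler, a stdlib-only sketch like this counts same-host and relative `<a href>` links on a page (the 1,400/1,500 figures above came from a real crawl; this is just an illustrative tool, and the sample HTML below is made up):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkCounter(HTMLParser):
    """Count <a href> links that are relative or point at the given host."""

    def __init__(self, host):
        super().__init__()
        self.host = host
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        netloc = urlparse(href).netloc
        # Relative links ("" netloc) and same-host absolute links are internal.
        if netloc in ("", self.host):
            self.count += 1

html = (
    '<a href="/products/citizenship-gifts.asp">gifts</a>'
    '<a href="https://www.flagandbanner.com/products/flags.asp">flags</a>'
    '<a href="https://twitter.com/share">share</a>'
)
counter = InternalLinkCounter("www.flagandbanner.com")
counter.feed(html)
print(counter.count)  # 2
```

Run it against a fetched product page's HTML and you can track the count before and after trimming the navigation.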
Finally, on a different note, you need to standardize the casing of your URLs (i.e., /Products/ vs. /products/). I noticed that you have both internal and external links that do not take this into account, causing unnecessary duplicate content.
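On an IIS/ASP site the usual fix is a server-side 301 rule that lowercases incoming URLs (the URL Rewrite module has a built-in pattern for this). As a sketch of the normalization such a rule performs, here is a small Python function that lowercases scheme, host, and path while leaving the query string and fragment alone:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_case(url: str) -> str:
    """Lowercase scheme, host, and path; preserve query and fragment.

    Mirrors what a 301 lowercase-redirect rule would do server-side,
    so every casing variant canonicalizes to one URL.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    return urlunsplit((scheme.lower(), netloc.lower(), path.lower(), query, fragment))

print(normalize_case("https://www.flagandbanner.com/Products/Citizenship-Gifts.asp"))
# https://www.flagandbanner.com/products/citizenship-gifts.asp
```

Pairing the redirect with a self-referencing canonical tag on each page covers any mixed-case links you can't control externally.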