How to Resolve Google Crawling Issues for My eCommerce Website?
-
I want to resolve Google crawling issues for my eCommerce website: http://www.vistastores.com/
Google has crawled only 97 pages from my site, even though it is quite old (more than six months).
I created a campaign for it in the SEOmoz tool and found some errors there, so I assumed those errors were why Google was not crawling my website. To check, I created another campaign for a competitor's website to see its actual status. I found that my competitor's website has more errors than mine, yet Google has crawled far more of its pages. What is the reason behind this? How can I improve my crawl rate and get more of my pages indexed in Google?
-
I am aware of that, but I am confused about omitted results. Google shows me 509 URLs in the visible portion and the rest in omitted results. I checked the same thing for my competitor with the table lamps keyword:
site:simplytablelamps.com
Google shows 2,470 pages for my competitor, and it has far fewer omitted results than my website. I don't know much about this, but my assumption is that products buried in the omitted results may lose impressions. What do you think? Any further ideas would help me a lot.
-
When I do a site:vistastores.com search, I see about 2,300 results indexed in Google, which indicates Google is both crawling and indexing your site. How many pages do you have?
-
Hi Liam,
I am coming back to this question after a long time because I am still struggling with this crawling issue. Google is not crawling my website even after I implemented the whole checklist.
Today I read a blog post about increasing Google's crawl rate. It suggests setting a custom crawl rate with the help of Google Webmaster Tools. Does that really improve crawling?
Which is more important and helpful: natural crawling by Google, or forcing a faster crawl rate?
-
After a lot of research, I am going forward with multiple XML sitemaps.
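For anyone following the same route: the sitemaps.org protocol ties multiple sitemap files together with a sitemap index file submitted in place of a single sitemap. A minimal sketch is below; the child sitemap file names are hypothetical, not the site's actual files.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index referencing several child sitemaps (hypothetical file names) -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.vistastores.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.vistastores.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child sitemap can hold up to 50,000 URLs, so splitting by product category also makes it easier to spot which sections are not getting indexed.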
-
I am still waiting for additional answers along the same lines. Can anyone help me? I need an answer urgently.
-
Yes, your suggestion is right. I have added an XML sitemap to Google Webmaster Tools, but I am still confused about crawling.
-
Not sure if you've already done this, but if not - signup to Google Webmaster Tools and submit your sitemap. You can also check crawl errors etc.
With e-commerce websites it's always useful to use an automated XML sitemap so that each product page is added and can then be crawled and indexed.
Let me know if this answers your question??
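To illustrate what "automated" can mean here: a script that reads product URLs from the store's database and emits a sitemaps.org-compliant XML file. This is only a minimal sketch; the example URLs are hypothetical placeholders, and a real setup would pull them from the ecommerce database and regenerate the file on a schedule.

```python
# Minimal sketch of an automated XML sitemap generator.
# The product URLs below are hypothetical placeholders; a real
# ecommerce site would pull them from its product database.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org <urlset> document as a string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

product_urls = [
    "http://www.vistastores.com/table-lamps/product-12345.html",
    "http://www.vistastores.com/patio-umbrellas/product-67890.html",
]
print(build_sitemap(product_urls))
```

Regenerating this file whenever products are added or removed, and submitting it once in Webmaster Tools, keeps the sitemap in sync without manual edits.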
Related Questions
-
Website Not Indexing
My website is not indexing because of a complaint someone registered with Google in 2013. The complaint says I am using someone else's copyrighted images on my website, but those images were immediately removed by me. I only just came to know about this complaint and have lodged my problem in the Lumen database. Fourteen days have already passed, but my page is still not indexed. Can anyone please help me?
On-Page Optimization | | varun18000 -
Are the H1 tags OK for my website?
Please review the H1 tags on my website and let me know if they are OK or if I need to change them. Website: brandstenmedia.com.au
On-Page Optimization | | Green.landon0 -
Moz Crawl Shows Duplicate Content Which Doesn't Seem To Appear In Google?
Morning all, first post, be gentle! I had Moz crawl our website and it flagged 2,500 high-priority duplicate content issues. Not good. However, if I just do a simple site:www.myurl.com in Google, I cannot see these duplicate pages... very odd. Here is an example:
On-Page Optimization | | scottiedog
http://goo.gl/GXTE0I
http://goo.gl/dcAqdU
So the same page has a different URL. Moz flags this as an issue, and I would agree. However, if I search for both URLs in Google, they both bring up the same page, but with the original URL of http://goo.gl/zDzI7j. In other words, two different URLs bring up the same indexed page in Google... weird. I thought about using a wildcard in robots.txt to disallow these duplicate pages with poor URLs, something like:
Disallow: /*display.php?product_id
However, I've read various posts saying it might not help, and I don't want to make things worse. On another note, my colleague paid for an "SEO service" that just dumped thousands of backlinks on our website, and of course that has come back to bite us. Does anyone have recommendations for a good service to remove these backlinks? Thanks in advance!
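For reference, the wildcard rule being considered would sit in robots.txt like this (Googlebot supports the * wildcard in Disallow paths, though blocking crawling does not remove URLs that are already indexed, and blocked pages cannot pass link signals):

```
# Block crawling of parameterized duplicate product URLs
User-agent: *
Disallow: /*display.php?product_id
```

A rel=canonical on the duplicate URLs is usually the safer fix for this kind of duplication, since it consolidates signals rather than just hiding the pages from crawlers.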
Canonical Tag for Ecommerce Site
My client has an ecommerce site with over 1,000 products. We have a ton of duplicates because of how their ecommerce system handles product pages. Each time a new product is added, a default product page is created (/product/12345-product-name.aspx). Each time that product is added to a specific product category, another, separate URL is created (/product/office-chairs/12345-product-name.aspx). The site has over 1,000 duplicates (at least one per product) because of how the ecommerce system structures URLs. We are unable to have unique content on /product/12345-product-name.aspx and /product/office-chairs/12345-product-name.aspx because both pages pull from the same database. Their web team informed me that they can't implement canonical tags on individual pages; the tags must be added dynamically across the whole site at once, which would force me to choose all of the default product pages as the primary URLs. Both types of URLs are getting indexed, and the category-based product URLs are the SEO-friendly ones, so I'm leery of eliminating one or the other with a canonical tag or a noindex. Suggestions?
On-Page Optimization | | DynoSaur0 -
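For the scenario described in that question, a canonical tag is a single line placed in the <head> of each duplicate, pointing at whichever version is chosen as primary. A hedged sketch (the domain and paths here mirror the hypothetical URLs in the question, not a real site):

```html
<!-- In the <head> of /product/12345-product-name.aspx,
     declaring the category-based URL as the primary version -->
<link rel="canonical" href="http://example.com/product/office-chairs/12345-product-name.aspx" />
```

The tag consolidates indexing signals onto the canonical URL without removing the duplicate from the site, which is why it is generally preferred over noindex for this pattern.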
How to make google not index quotes from other sites?
Hey guys, I have a site where we post quite a lot of info from other sites. We don't want Google to de-index our pages because parts of them are quotes from other sites. What would you use so that Google sees a passage is a quote from another site? Or to just keep Google from indexing the quote? Thanks!
On-Page Optimization | | StefanJDorresteijn0 -
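As far as I know, Google offers no supported way to exclude just a fragment of a page from web-search indexing; what you can do is mark quotes up semantically so their source is unambiguous. A sketch, with a placeholder source URL:

```html
<!-- Semantic markup attributing a quoted passage to its source -->
<blockquote cite="http://example.com/original-article">
  <p>Quoted passage from the source site goes here.</p>
</blockquote>
<p>Source: <a href="http://example.com/original-article">the original article</a></p>
```

This doesn't prevent indexing of the quote, but clear attribution plus substantial original commentary around it is the usual way to avoid a page being treated as duplicate content.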
Can Other Websites You Own Affect Your Rankings?
Let's say you own three websites. One is low quality and Google frowns upon it, another is moderate, and a third has stellar unique content. Would Google penalize the third website because you own the first? On a related note: if you were banned from Google AdWords, would registering a site in your name potentially harm your rankings?
On-Page Optimization | | DerekP0 -
Should a crawl ever take more than 7 days?
I signed up for the 30-day trial last Saturday; however, as of yet, the crawl diagnostics page still says "First crawl in progress". Is this normal? Many thanks.
On-Page Optimization | | danzspas0 -
Filtered Navigation, Duplicate content issue on an Ecommerce Website
I have navigation that allows for multiple levels of filtering. What is the best way to prevent search engines from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution.

For example, you have a page that lists 12 products out of 100: companyname.com/productcategory/page1.htm. Then you filter these products: companyname.com/productcategory/filters/page1.htm. The filtered page may or may not contain items from the original page, but it does contain items that are on the unfiltered navigation pages.

How do you help the search engine determine which of these pages it should crawl and index? I can't use rel=canonical, because the exact set of products on the filtered page may not appear on any unfiltered page. What about robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about a meta noindex tag on the filtered pages? I have also considered removing filters entirely, but I'm not sure sacrificing usability is worth it just to remove duplicate content. I've read a bunch of blogs and articles, and seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
On-Page Optimization | | 13375auc30
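On the meta noindex option raised in that question: unlike robots.txt, a robots meta tag lets crawlers fetch the filtered page and follow its links, so internal PageRank can still flow to the product pages while the filtered URL stays out of the index. A sketch of the tag:

```html
<!-- In the <head> of each filtered page: keep the page out of the
     index, but let crawlers follow its links to the products -->
<meta name="robots" content="noindex, follow" />
```

Note this only works if the filtered pages are not blocked in robots.txt, since a crawler must fetch the page to see the tag.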