Big problem with my new crawl report
-
I am the owner of a small OpenCart online store. I installed http://www.opencart.com/index.php?route=extension/extension/info&extension_id=6182&filter_search=seo. Today my new crawl report is awful: errors are up to 520 (30 before), warnings are up to 1,000 (120 before), and notices are up to 8,000 (1,000 before). I noticed that the problem is with search. There is a lot of duplicate content on the search pages only. What should I do?
-
Thank you again Alan.
Typo fixed.
-
I use the Bing search API.
By the way, you want to change from GET to POST, not the other way around.
-
Alan,
Thank you for the great advice. If one has enough control over the eCommerce system, or the internal site search product, to change from GET to POST so these pages act more like real dynamically generated "search pages" than an infinite number of "landing pages", I think that is a fantastic solution. It would keep merchandisers and others from linking to those pages - because we all know that they will continue to do it even if the SEO pleads on hands and knees for them to stop.
However, I have found it to be the case that most eCommerce businesses (from small mom-n-pop shops to Fortune 500 companies) do not have the ability to do this because the internal site search functionality they use is out of their hands. Site search vendors like Endeca and Celebros serving enterprise eCommerce businesses don't typically hand over the keys to the client.
If you know any site search vendors or solutions that allow one to do this it would make a great contribution to this thread if you could share a few of them. I'd definitely look into recommending them in the future!
Thanks again!
-
The problem with PR leaks is that they scale: if you are losing 10%, then when you get some quality links, 10% of them will be wasted, and every effort you make in the future will be discounted by 10%.
There are ways to fix all these problems. For example, I would make the search use POST instead of GET, so that links to search pages cannot be made and therefore search pages will not get indexed.
We work so hard to get good links; why waste them once you have them?
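The GET-to-POST idea can be sketched in a few lines. This is a toy Python handler with hypothetical names (an OpenCart store would do this in PHP, but the mechanism is the same): search results are only reachable via POST, so there is no crawlable or linkable /search?q=... URL in the first place.

```python
def handle_search(method: str, params: dict) -> tuple[int, str]:
    """Toy request handler illustrating a POST-only site search.

    Because results are only reachable via POST, no /search?q=... URL
    exists for merchandisers to link to or for a spider to crawl.
    """
    if method != "POST":
        # A GET hit (i.e. someone followed or crawled a link) is rejected.
        return 405, "Method Not Allowed"
    query = params.get("q", "")
    return 200, f"search results for {query!r}"
```

On a real store this usually comes down to switching the search form's `method` attribute from `get` to `post`, assuming the backend will accept it.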
-
I have tried different methods to fix this. First-hand experience tells me that it is often better to just block the paths from being crawled using robots.txt (assuming there is better navigation on the site) than to use a noindex,follow tag in order to save the PageRank you're sending via internal links. It is very easy for Google to get bogged down crawling around in the internal search results area.
Unless there are lots of links to search pages from top pages on the site, or a big list of search page links from every page (sitewide footer, for example) I really don't think the waste of internal pagerank is noticeable in the rankings, or worth salvaging if it risks sending spiders into a maze or a trap.
Yes, best practice is not to link to pages that you are blocking. In the real world, though, search pages can be very useful to visitors, and merchandisers who don't have the ability to create more targeted sub-sub-sub-categories will often use them, and link to them on the site, as landing pages for promotional purposes (emails, PPC, sales...).
Everyone has their own strategies, and all we can do is make recommendations based on our own experience and knowledge. Thanks for helping out with this question Alan. Feel free to elaborate so Anastas has more input to help guide his decision.
-
As long as no one is linking to the search pages, including internal links.
-
Hello Anastas,
I agree that you should block the search folder from being indexed. I'm going to assume that nobody is linking to your search pages and that you have other paths (e.g. SEO-friendly navigation, sitemaps...) for search engines to use to reach your products.
I don't understand why you have formatted the disallow statement that way, however. Unless I'm missing something (and could be since I don't know what your site is) you only need to do this:
Disallow: /product/search*
And of course after doing this you should test it in GWT to make sure that A: You are blocking the pages you want to block, such as search pages with lots of parameters, and B: You are NOT blocking other pages you don't want to block, such as product pages. Here is more info on where to find the testing tool in GWT if you don't know: http://productforums.google.com/forum/#!topic/webmasters/tbikAxJiIZ4
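If you want a quick sanity check outside of GWT as well, here is a minimal Python sketch of Google-style robots.txt matching (`*` matches any run of characters, a trailing `$` anchors the end; this is simplified and ignores Allow rules and longest-match precedence). Run it against your own URLs - note that a rule keyed on `?route=` will not match the rewritten "SEO URL" form of the search pages:

```python
import re

def rule_matches(disallow_rule: str, url_path: str) -> bool:
    """Return True if a Disallow rule blocks the given URL path.

    Implements only the wildcard part of Google's robots.txt matching:
    '*' matches any character sequence, a trailing '$' anchors the end,
    and rules otherwise match as prefixes.
    """
    pattern = re.escape(disallow_rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The existing rule blocks the query-string form of the search URL...
print(rule_matches("/*?route=product/search",
                   "/index.php?route=product/search&filter_tag=abc"))  # True
# ...but not the rewritten form, which contains no '?route=':
print(rule_matches("/*?route=product/search",
                   "/product/search&filter_tag=abc"))  # False
# A plain prefix rule catches the rewritten form:
print(rule_matches("/product/search",
                   "/product/search&filter_tag=abc"))  # True
```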
Let us know how it goes. Good luck.
-
Please I need help
-
I am using OpenCart. I don't know what to do. Before I had 50 errors; now there are more than 500 after installing this plugin. The plugin removed the previous errors, but now there are many different errors. I have 2 options:
1. Remove the plugin
2. Do something about the new errors - the new errors are only because of search. I have duplicate page content because when you type PRODUCT NAME in the search box, the result shows the same content as www.mydomain.com/category1/PRODUCT NAME
Maybe this plugin removed the canonical URLs in search, or I don't know what.
In robots.txt there is row:
Disallow: /*?route=product/search
The duplicate content is at mydomain.com/product/search&filter_tag=XXXXXX
Instead of XXXXXX there are many different values.
I decided to add another row in robots.txt:
Disallow: /*?route=product/search&filter_tag=/
Do you think it is correct, or should I remove the plugin?
I hope you understand what the problem is.
-
When you noindex a page, any links pointing to those pages pour away link juice from your indexed pages. You should never noindex pages, IMO.
I assume you are using a CMS or some sort of plugin; this is a common cost when you do so. CMSs create very untidy code, which is not good for SEO.
-
The urls are: /product/search&filter_tag=%D0%B1%D0%B8%D0%B6%D1%83%D1%82%D0%B0
After the = there are a lot of combinations. Is it correct to put this in robots.txt?
Disallow: /*?route=product/search&filter_tag=/
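For what it's worth, those %-sequences are just UTF-8 percent-encoded search terms, which is why there are so many combinations: every term a visitor searches for produces its own URL. A quick Python decode of the example above shows this:

```python
from urllib.parse import unquote

tag = "%D0%B1%D0%B8%D0%B6%D1%83%D1%82%D0%B0"
print(unquote(tag))  # -> бижута (Bulgarian for "jewellery")
```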
-
Should I disallow search (in robots.txt)?