Should I noindex shop page and blog page for SEO?
-
I have about 15 products in my store. Should I noindex the shop and blog pages for SEO?
The reason I ask is that I've seen someone suggest noindexing archive pages. The shop page is a product archive, and the blog page is an archive too, so should I choose index or noindex?
Thanks!
-
You should index those pages. We don't quite understand what you mean: why would you set up a blog and a shop if you don't want them indexed?
The most common approach is to put noindex on pages whose rankings you don't care about, such as the legal notice or the cookie policy.
-
You shouldn't noindex your blog or shop pages.
But I am confused about what you mean when you say your blog and shop pages are archive pages. Can you explain further, please?
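One concrete way to settle questions like this is to check what a page's meta robots tag actually declares. Below is a minimal, stdlib-only Python sketch; the sample HTML strings are made up for illustration:

```python
# Minimal sketch: detect whether an HTML page carries a noindex directive
# in its robots meta tag. Standard library only; sample HTML is hypothetical.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in attrs.get("content", "").split(",")
            )

def is_noindexed(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

# A shop or blog archive page would normally carry no noindex directive:
print(is_noindexed('<meta name="robots" content="index, follow">'))   # False
# A cookie-policy page might opt out of the index:
print(is_noindexed('<meta name="robots" content="noindex, follow">')) # True
```

The same check works on a saved copy of your shop or blog page; if no robots meta tag is present at all, search engines default to index, follow.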
Related Questions
-
What would be the best course of action to nullify negative effects of our website's content being duplicated (Negative SEO)
Hello, everyone

About 3 months ago I joined a company that manufactures transportation and packaging items. Once I started digging into the website, I noticed that a lot of their content was "plagiarized". I use quotes because it really was not: they seem to have been hit with a negative SEO campaign last year in which their content was taken and posted across at least 15 different websites. Literally every page on their website had the same problem, and some of the content was even company-specific (going as far as using the company's very unique name). In all my years of working in SEO and marketing I have never seen something at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked/compromised (some examples include charity websites).

I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking for removal of the content? Or does this duplicate content not hold as much weight as it used to, especially since our content was posted years before the duplicates started popping up?

Thanks,
White Hat / Black Hat SEO | Hasanovic
Will pillar posts create a duplicate content issue if we un-gate ebooks/guides and use exact copy from blogs?
Hi there! With the rise of pillar posts, I have a question on the duplicate content issue it may present. If we are un-gating ebook/guides and using (at times) exact copy from our blog posts, will this harm our SEO efforts? This would go against the goal of our post and is mission-critical to understand before we implement pillar posts for our clients.
White Hat / Black Hat SEO | Olivia954
Submitting a page to Google Search Console or Bing Webmaster Tools with nofollow tags
Hello, I was hoping someone could help me understand whether there is any point in submitting a domain or subdomain to Google Search Console (Webmaster Tools) and Bing Webmaster Tools if the pages (on the subdomain, for example) all have nofollow/noindex tags, or are being blocked by the robots.txt file. There are some pages on a data feed on a subdomain I manage that have these characteristics, which I cannot change. Is it better to simply exclude them from GSC and BWT, thereby eliminating the errors and warnings they would generate? Or is it better to tell Google and Bing about them anyway, on the chance that those nofollow pages may be indexed/contextualised in some way, making it worth the effort? Many thanks!
Mark
White Hat / Black Hat SEO | uworlds
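On the robots.txt side of this question, you can test locally which URLs a given rule set blocks before deciding what to submit. Here is a short sketch using Python's standard library; the subdomain, paths, and rules are hypothetical:

```python
# Sketch: check locally which URLs a robots.txt blocks, without any network
# fetch. The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /feed/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())  # parse() takes the file's lines directly

print(rp.can_fetch("*", "https://data.example.com/feed/item1"))  # False: blocked
print(rp.can_fetch("*", "https://data.example.com/about"))       # True: crawlable
```

One caveat worth remembering: robots.txt blocking only stops crawling, not indexing, so a blocked URL can still appear in results as a bare link if something links to it. A noindex tag, by contrast, has to be crawlable to be seen at all.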
Massive site-wide internal footer links to doorway pages: how bad is this?
My company has stuffed several hundred links into the footer of every page. Well, technically not the footer, as they're right at the end of the body tag, but basically the same thing. They are formatted as follows:

<a href="http://example.com/springfield_pa_real_estate.htm" target="_blank">springfield, pa real estate</a>

These direct to individual pages that contain the same few images and variations of the following text that just replace the town and state:

_Springfield, PA Real Estate - Springfield County [images] This page features links to help you Find Listings and Homes for sale in the Springfield area MLS, Springfield Real Estate Agents, and Springfield home values. Our free real estate services feature all Springfield and Springfield suburban areas. We also have information on Springfield home selling, Springfield home buying, financing and mortgages, insurance and other realty services for anyone looking to sell a home or buy a home in Springfield. And if you are relocating to Springfield or want Springfield relocation information we can help with our Relocation Network._

The bolded text links to our internal site pages for buying, selling, relocation, etc. Like I said, this is repeated several hundred times, on every single page on our site. In our XML sitemap file, there are links to:

http://www.example.com/Real_Estate/City/Springfield/
http://www.example.com/Real_Estate/City/Springfield/Homes/
http://www.example.com/Real_Estate/City/Springfield/Townhomes/

These direct to separate pages with a Google map result for properties for sale in Springfield, accompanied by a boilerplate version of this:

_Find Springfield Pennsylvania Real Estate for sale on www.example.com - your complete source for all Springfield Pennsylvania real estate. Using www.example.com, you can search the entire local Multiple Listing Service (MLS) for up to date Springfield Pennsylvania real estate for sale that may not be available elsewhere. This includes every Springfield Pennsylvania property that's currently for sale and listed on our local MLS. Example Company is a fully licensed Springfield Pennsylvania real estate provider._

Google Webmaster Tools is reporting that some of these pages have over 30,000 internal links on our site. However, GWT isn't reporting any manual actions that need to be addressed. How blatantly abusive and spammy is this? At best, Google doesn't care a spit about it, but the worst case is that it is actively harming our SERP rankings. What's the best way to go about dealing with this?

The site did have Analytics running, but the company lost the account information years ago, otherwise I'd check the numbers to see if we were ever hit by Panda/Penguin. I just got a new Analytics account implemented 2 weeks ago. Of course it's still using deprecated object values, so I don't even know how accurate it is. Thanks everyone!
White Hat / Black Hat SEO | BD69
Rel Noindex Nofollow tag vs meta noindex nofollow robots
Hi Mozzers

I have something I was pondering this morning and would love to hear your opinion on it.

We had a bit of an issue on our client's website at the beginning of the year. I tried to find a way around it by using wildcards in my robots.txt, but because different search engines treat wildcards differently it didn't work out so well, and only some search engines understood what I was trying to do.

So here goes: a large number of URLs on the website carry a ?filter parameter pushed from the database. We use filters on the site so users can find what they are looking for much more easily, which results in database-driven ?filter URLs (those ugly URLs we all hate so much). What we are looking to do is implement nofollow/noindex on all the internal links pointing to the ?filter parameter URLs. However, my SEO sense tells me the noindex/nofollow should rather sit in the meta robots of the individual ?filter URLs themselves instead of on all the internal links pointing to them. Am I right in thinking this way? (The reason we want to put it on the internal links at the moment is that the development company says they don't have control over the metadata of these database-driven parameter URLs.)

If I am not mistaken, noindex/nofollow on the internal links could be seen as PageRank sculpting, whereas on-page meta robots noindex/nofollow is more of a command, like your robots.txt. Has anyone tested this before, or does anyone have more knowledge on the finer details of noindex/nofollow?

PS: canonical tags are also not doable at this point, because we are still in the process of cleaning out all the parameter URLs, so roughly 70% of the URLs don't yet have an SEO-friendly URL to be canonicalized to.

PPS: another reason this needs looking at is that search engines won't be able to make an interpretation of these pages (until they have been cleaned up and fleshed out with unique content), which could result in bad rankings, which in turn could leave my users unsatisfied. So over and above the SEO factor, the usability of the site is being looked at here as well: I don't want my users to land on these pages at the moment. If they navigate to them via the filters, then awesome, because they are defining what they are looking for with the filters.

Would love to hear your thoughts on this. Thanks, Chris Captivate.
White Hat / Black Hat SEO | DROIDSTERS
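To illustrate the page-level approach the question leans toward, here is a sketch of the decision rule: serve noindex on the ?filter URLs themselves rather than marking the links. The parameter name "filter" comes from the question; the function name and URLs are hypothetical:

```python
# Sketch: decide the robots directive a URL should serve based on whether
# it carries the database-driven ?filter parameter. Function name and
# example URLs are hypothetical; "filter" is the parameter from the question.
from urllib.parse import urlparse, parse_qs

def robots_directive(url: str) -> str:
    """Return the meta-robots value this URL should serve."""
    params = parse_qs(urlparse(url).query)
    if "filter" in params:
        # Keep the page out of the index but let crawlers follow its links.
        return "noindex, follow"
    return "index, follow"

print(robots_directive("https://example.com/products?filter=red"))  # noindex, follow
print(robots_directive("https://example.com/products"))             # index, follow
```

Since the development company can't touch the page metadata, the same rule can often be applied one layer up: an `X-Robots-Tag: noindex, follow` response header set by the web server for matching URLs, which search engines treat the same way as the meta tag.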
How best to do location-specific pages for eCommerce post Panda update?
Hi,

We have an eCommerce site and currently we have a problem with duplicate content. We created location-specific landing pages for our product categories, which initially did very well until the recent Google Panda update caused a big drop in rankings and traffic. For example:

http://xxx.co.uk/rent/lawn-mower/London/100
http://xxx.co.uk/rent/lawn-mower/Manchester/100

Much of the content on these location pages is the same or very similar apart from a different H1 tag and title tag, and in some cases slight variations in the on-page content. Given that these items can be hired from 200 locations, it would take years to write unique content for every location in each category.

We did this originally in April because we can't compete nationally, but we found it easier to compete locally; hence the creation of the location pages, and they did do well for us until now.

My question is: since the last Google Panda update our traffic has dropped 40%, rankings have gone through the floor, and we are stuck with this mess. Should we get rid of (301) all of the location-specific pages for each category, or just keep, say, the 10 most popular locations per city and either noindex/nofollow or 301 the rest? What would people recommend?

The only examples I can see on the internet of others handling multiple locations use a store-finder type approach, but you can't rank for the individual product/category doing it that way.

If anyone has any advice or good examples of sites that employ a good location-specific URL method, please let me know.

Thanks
Sarah
White Hat / Black Hat SEO | SarahCollins
Is my SEO strategy solid moving forward (post panda update) or am I doing risky things that might hurt my sites down the road?
Hey all,

When I first started doing SEO, I was encouraged by several supposed experts to buy links from "respectable" sources and to make use of SEO experimentation offered on Fiverr. I did that a lot for the clients I represented, not knowing if it would hurt. Now, after the latest Google shift, I realize that this was stupid and thus deserving of the ranking drops I have received. In the aftermath, I want to list out what I am doing now to try to build better and stronger rankings for my sites using white-hat techniques only.

Below is a list of what I'm doing. Please let me know if any of these are bad choices and I will immediately dump them. Also, if I am not including some good options, please let me know that too. I am really embarrassed and humbled by this and could use whatever help you can offer. Thanks in advance for your help.

What am I doing now?

* Writing quality articles for external blogs with keyword links back to sites
* Taking the above articles and spinning them at SEOLINKVINE to create several articles
* Writing quality articles for every site's internal blog and using keywords to link out to other sites that are on different servers (all articles are original, varied, and not duplicate content)
* Writing quality, relevant articles and submitting them to places like Ezine
* Signing clients up for Facebook, Yelp, Twitter, etc. so they have a social presence
* Working to fix mistakes with on-site issues (mirror sites, duplicate page titles, etc.)
* Writing quality, keyword-rich, unique content on each page of each site
* Submitting URL listings and descriptions to directories like JoeAnt, REALS, and business.com (any other good ones people can recommend that give good link juice?)
* Doing competitive research and going after highly authoritative links that our competitors have

That is about it. HELP! Thanks again
White Hat / Black Hat SEO | creativeguy
Herbal Viagra page same DA/PA as UC Berkeley??
Either there is some amazingly good SEO work going on here, or Google has an amazingly large hole in their metrics.

http://nottowait.com/
http://www.ucdavis.edu/index.html

The "nottowait" page has a PA of 85?! and a DA of 82?! The page is HORRIBLE. The page itself is an image of another page. The nav bar does not function, nor do any of the "click here" links. At the bottom there is a paragraph of keywords and broken English. This page is pure junk and should simply not have any value at all with respect to DA or PA. It has a ton of incoming links from various sources, which seem to be the source of all this value, which it passes on to other pages. This page really is an affront to the "content is king" concept.

I suppose I should ask a question, but all I can think of is: what is Matt Cutts' phone number? I want to ask him how this page has gotten away with being ranked so well for so long.
White Hat / Black Hat SEO | RyanKent