Robots.txt & Disallow: /*? Question!
-
Hi,
I have a site where they have:
Disallow: /*?
The problem is that we need URLs with the following parameter indexed:
?utm_source=google_shopping
What would the best solution be? I have read:
User-agent: *
Allow: ?utm_source=google_shopping
Disallow: /*?
Any ideas?
-
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /archives/
Disallow: /?
Allow: /comments/feed/
Disallow: /refer/
Disallow: /index.php
Disallow: /wp-content/plugins/
Allow: /wp-admin/admin-ajax.php

User-agent: Mediapartners-Google*
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: https://site.com/sitemap_index.xml

Use this; it should solve your problem.
Regards
-
Will the same robots.txt as above work?
Regards
Sajad -
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /archives/
Disallow: /*?*
Allow: /comments/feed/
Disallow: /refer/
Disallow: /index.php
Disallow: /wp-content/plugins/
Allow: /wp-admin/admin-ajax.php

User-agent: Mediapartners-Google*
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: https://site.com/sitemap_index.xml

Use this; it should help.
Regards,
[Saad](https://clicktestworld.com/)
-
Hi Jeff,
The robots.txt tester mentioned above is definitely worth playing with and is the easiest route to achieving what you want.
Another, more reactive way of managing this is to review the query parameters Google has already crawled, within Search Console.
For now, you can see this in the old Search Console: log in and go to Crawl --> URL Parameters.
If Googlebot has encountered any query parameters, it will list them, and you'll then have options for how to manage them or exclude them from the index.
It can be a decent way of cleaning up a site with lots of indexed pages (1,000+), although please be sure to read this documentation before using it: https://support.google.com/webmasters/answer/6080548?hl=en
-
With this kind of thing, it's really better to pick the specific parameters (or parameter combinations) which you'd like to exclude, e.g.:
User-agent: *
Disallow: /shop/product/&size=*
Disallow: */shop/product/*?size=*
Disallow: /stockists?product=*
^ I just took the above from a robots.txt file I've been working on, as these particular pages don't have 'pretty' URLs with unique content on them. Very soon that will change, and the blocks will be lifted.
If you are really 100% sure that there's only one param which you want to let through, then you'd go with:
User-agent: *
Disallow: /?
Allow: /?utm_source=google_shopping
Allow: /*&utm_source=google_shopping*
(or something pretty similar to that!)
Before you set anything live, write down a list of URLs representing the blocks (and allows) you want to achieve, and test them all with the robots.txt tester in Search Console.
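If you want a quick offline sanity check before (not instead of) the Search Console tester, Google's documented matching behaviour (a `*` wildcard matches any character sequence, a trailing `$` anchors the end, the longest matching pattern wins, and Allow beats Disallow on a tie) can be sketched in a few lines of Python. This is an illustrative approximation, not the parser Google actually runs, and the rule set below just uses the `/*?` pattern from the original question:

```python
import re

def pattern_to_regex(pattern):
    # Translate a robots.txt path pattern into a regex:
    # '*' -> '.*', trailing '$' -> end anchor, everything else literal.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    parts = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + parts + ("$" if anchored else ""))

def is_allowed(path, rules):
    """rules: list of ('allow'|'disallow', pattern) for one user-agent."""
    winner, winner_len = "allow", -1  # no matching rule => allowed
    for directive, pattern in rules:
        if pattern_to_regex(pattern).match(path):
            # Longest pattern wins; on a tie, 'allow' is least restrictive.
            if len(pattern) > winner_len or (
                len(pattern) == winner_len and directive == "allow"
            ):
                winner, winner_len = directive, len(pattern)
    return winner == "allow"

rules = [
    ("disallow", "/*?"),
    ("allow", "/*?utm_source=google_shopping"),
    ("allow", "/*&utm_source=google_shopping"),
]

print(is_allowed("/product?color=red", rules))                             # False (blocked)
print(is_allowed("/?utm_source=google_shopping", rules))                   # True (allowed)
print(is_allowed("/product?utm_source=google_shopping", rules))            # True (allowed)
print(is_allowed("/product?color=red&utm_source=google_shopping", rules))  # True (allowed)
```

Feeding your list of test URLs through something like this catches obvious mistakes early, but the Search Console tester should still be the final word before going live.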