Regex in Disavow Files?
-
Hi,
Will Regex expressions work in a disavow file?
If I include website.com/* will that work, or would you recommend just website.com?
Thanks.
-
Hi Fubra,
You can disavow at the domain level, so no regex is required (and as far as I know, regex won't work — entries are matched literally, not as patterns).
Just add "domain:" before the domain, e.g. domain:spammysite.com
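For reference, a disavow file is just a plain-text file, one entry per line — full URLs or domain: entries, no wildcards or regex (spammysite.com below is the illustrative domain from above):

```text
# Disavow file (plain text, UTF-8); lines starting with # are comments
# Disavow an entire domain, including its subdomains:
domain:spammysite.com
# Or disavow a single specific URL:
http://spammysite.com/bad-page.html
```

A domain: line covers every URL on that domain, which is why the wildcard isn't needed.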
Marie Haynes wrote a good guide to using the disavow tool here if you need any further information: https://mza.seotoolninja.com/blog/guide-to-googles-disavow-tool
Cheers,
David
Related Questions
-
Should I better noindex 'scripted' files in our portfolio?
Hello Moz community, As a means of a portfolio, we upload PowerPoint exports which are converted into HTML5 to maintain interactivity and animations. Works pretty nicely! We link to these exported files from our product pages. (We are a presentation design company, so they're pretty relevant.) For example: https://www.bentopresentaties.nl/wp-content/portfolio/ecar/index.html However, they keep coming up in the crawl warnings, as the exported HTML file doesn't contain text (just code), so we get errors for:
- thin content
- no H1
- missing meta description
- missing canonical tag
I could manually add the last two, but the first warnings are just unsolvable. Therefore I figured we'd probably better noindex all these files. They don't appear to contain any searchable content, and even then, the content of our clients' work is not relevant for our search terms etc. They're mere examples, just in the form of HTML files. Am I missing something, or should I noindex these/such files? (And if so: is there a way to noindex a whole directory automatically, so I don't have to manually 'fix' all the HTML exports with a noindex tag in the future? I read that using disallow in robots.txt wouldn't work, as we will still link to these files as portfolio examples.)
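One common way to noindex a whole directory without editing each file is the X-Robots-Tag HTTP header. A minimal sketch, assuming an Apache server with mod_headers enabled (the directory path comes from the question; verify your host supports .htaccess overrides):

```apache
# .htaccess placed inside /wp-content/portfolio/ (sketch; assumes Apache + mod_headers)
# Sends an "X-Robots-Tag: noindex" header for every file served from this directory,
# so new exports are covered automatically without editing their HTML.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```

Unlike a robots.txt disallow, this still lets crawlers fetch the files (so the noindex is actually seen), while keeping them out of the index.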
Intermediate & Advanced SEO | BentoPres
-
Set Robots.txt file to crawl my website at specific times
Our website provider has stated that they can only 'lift' their block on our website, in order for it to be crawled, at specific times. Is there any way to amend a robots.txt file to ensure that our website is crawled at a specific time of day/night, to coincide with the block being lifted? Many Thanks, Charlene
Intermediate & Advanced SEO | CharleneKennedy12
-
Disavow- What Happens and What Should I Do?
We have a site that got hit really hard by a non-manual penalty in July. I submitted a disavow file for the site placeyourlinks.com, which had a bunch of clearly spammy links to our site listed in Webmaster Tools. But the site itself was down for a long time, so I couldn't see where the links even were. Then those links disappeared from the links file. I thought the URLs were removed or the site was seen as being blank. But now they're back... and the site itself is shown as just being a blank page. I don't know what to do, since I don't want to disavow those links again if it wasn't even addressed the first time, and there is obviously no way to contact the site. Help!

Also, I've done a bunch of work on the site to increase the amount of content while I was waiting to see what happened with the link disavow. But now all that is done and our rankings are still waaaay down. I'm considering getting really, really aggressive with link removal and disavowing if needed, but I'm not sure what I should focus on removing/disavowing. Really bad sites with only one or two links? Sites that have a lot of links to the site? Sites with keyword-stuffy anchor text? Any help on this would be much appreciated.
Intermediate & Advanced SEO | Fuel
-
To recover from Penguin update, shall i remove the links or disavow links?
Hi, One of our websites was hit by the Penguin update, and I now know where the links are coming from. I have the chance to remove the links from those sites, but I am a little confused: should I just remove the incoming links, or disavow them? Thanks
Intermediate & Advanced SEO | Rubix
-
Duplicate Content From Indexing of non- File Extension Page
Google somehow has indexed a page of mine without the .html extension. So they indexed www.samplepage.com/page, and I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google or Bing or whoever to only index and see the page including the .html extension? I know people are saying not to use the file extension on pages, but I want to, so please anybody... HELP!!!
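One way to consolidate the two versions is a 301 redirect from the extensionless URL to the .html one, so only one version can be indexed. A sketch, assuming an Apache server with mod_rewrite (adapt the pattern to your own URL structure):

```apache
# Sketch: 301-redirect extensionless URLs to their .html counterparts (Apache mod_rewrite)
RewriteEngine On
# Only if the request is not a real directory...
RewriteCond %{REQUEST_FILENAME} !-d
# ...and a matching .html file exists on disk...
RewriteCond %{REQUEST_FILENAME}.html -f
# ...redirect /page to /page.html permanently
RewriteRule ^(.+)$ /$1.html [R=301,L]
```

An alternative (or complement) is a rel=canonical tag on the page pointing at the .html URL, which tells Google which version to treat as the original without redirecting users.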
Intermediate & Advanced SEO | WebbyNabler
-
Disavow tool removed all our links from webmaster tools
We recently used the Google Disavow tool to remove 200 bad links, but Google has since removed nearly all our links from Webmaster Tools: from over 2,000 we now only have 150! Has anyone had the same problem? Any advice would be much appreciated. Thanks Paul
Intermediate & Advanced SEO | webdesigncwd
-
Should I bother disavowing nofollow backlinks?
Hello! I am about to go through the list of backlinks in our profile and sort out what we want to disavow. A question I had is: should I bother disavowing nofollow backlinks, even if they look spammy? Or should I just focus on cleaning up the dofollows? Thanks!
Intermediate & Advanced SEO | Ryan_Phillips
-
Negative impact on crawling after upload robots.txt file on HTTPS pages
I experienced negative impact on crawling after upload robots.txt file on HTTPS pages. You can find out both URLs as follow. Robots.txt File for HTTP: http://www.vistastores.com/robots.txt Robots.txt File for HTTPS: https://www.vistastores.com/robots.txt I have disallowed all crawlers for HTTPS pages with following syntax. User-agent: *
Intermediate & Advanced SEO | | CommercePundit
Disallow: / Does it matter for that? If I have done any thing wrong so give me more idea to fix this issue.0
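For what it's worth, the two-line rule above (User-agent: * followed by Disallow: /) blocks every compliant crawler from every HTTPS URL, so Google cannot fetch those pages at all. If the intent is actually to let the HTTPS site be crawled, a minimal sketch of an open robots.txt would be:

```text
# robots.txt sketch: an empty Disallow value means nothing is blocked
User-agent: *
Disallow:
```

Note that blocking crawling is not the same as preventing indexing: URLs disallowed in robots.txt can still appear in search results (without a snippet) if other sites link to them.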