Anyone Used ScrapeBox or SENukeX Before?
-
I have been looking at trying out ScrapeBox or SENukeX for a while, but I don't want to waste my money. Has anyone tried them with success? I am not necessarily looking for an automated submission platform. I am simply looking for a tool that can tell me which sites are relevant to mine, whether their links are dofollow, and so on. That is what I would be using them for.
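As a side note, a basic dofollow check of this kind is easy to script yourself; here is a rough sketch (the function name is my own, and the regex approach is only an approximation compared to a real HTML parser):

```javascript
// Rough dofollow check: scan anchor tags in fetched HTML and flag
// those without rel="nofollow". Regex parsing is approximate; a real
// crawler should use a proper HTML parser.
function classifyLinks(html) {
  const links = [];
  const anchorRe = /<a\b[^>]*>/gi;
  for (const [tag] of html.matchAll(anchorRe)) {
    const hrefMatch = tag.match(/href\s*=\s*["']([^"']+)["']/i);
    if (!hrefMatch) continue; // skip anchors without an href
    const relMatch = tag.match(/rel\s*=\s*["']([^"']*)["']/i);
    const nofollow = relMatch ? /\bnofollow\b/i.test(relMatch[1]) : false;
    links.push({ href: hrefMatch[1], dofollow: !nofollow });
  }
  return links;
}
```

Run it against the HTML of any prospect page (fetched however you like) to get a quick list of which outbound links would pass equity.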
-
ScrapeBox is an excellent tool for blog and forum discovery. SENukeX doesn't really help at all in that department; it is only a decent tool if you find creative ways to use it, like building your own blog networks.
-
Yes, thank you for that information. I was not planning to use ScrapeBox as a means to auto-generate links. I only need something to help me discover sites to get links from.
-
Hey Keri,
Thanks for the link; it was a good read. Funny thing: once I figured out how SENuke spins articles, I started to notice them. Several times I have found myself reading an article and thinking that I would really enjoy reading the original version. Frankly, I can't stand spun articles, and I hope that both people and search engines learn to tell the difference between an original article and a spun one; anyone spinning articles should be penalized.
Having said that, if I can read a spun article without noticing it, and I probably have, that is good enough for me.
I would also expect the search engines to be a little more aggressive about spun articles than they are about paid links. Your competitor is much less likely to spin articles on your behalf than they are to build crappy links for you.
David
-
Check out this thread from earlier this month, where someone was evaluating SENuke and decided against it. You can read about his experience and the opinions of others as well. Generally, the opinion was not positive.
-
I loaded it on my computer, and it looked hard to use at best. After educating myself more about what SEO really is, I decided against actually using it. IMO it may have been good at one time, but I think the search engines are getting wise to this kind of thing. It looks like a really good way to get sandboxed to me.
Related Questions
-
Using hreflang="en" instead of hreflang="en-gb"
Hello, I have a question in regard to international SEO and the hreflang meta tag. We are currently a B2B business in the UK. Our major market is England, with some exceptions of international sales. We want to improve our rankings in other English-speaking countries and regions, such as Ireland and the Channel Islands. My research has found regional Google search engines for Ireland (google.ie), Jersey (google.je) and Guernsey (google.gg). All of these regions have English as one of their main languages, so here are my questions. Because I use hreflang="en-gb" as my site language, am I regionally excluding these countries and islands? If I used hreflang="en", would it include these English-speaking regions and possibly increase the rankings on these regional search engines? Thank you,
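For reference, hreflang annotations of this kind go in the head of each page; a minimal sketch with hypothetical URLs (the general "en" tag acts as a catch-all for English speakers not matched by a more specific tag, which is how Ireland, Jersey, and Guernsey would be covered while en-gb users still get the UK version):

```html
<!-- Hypothetical URLs: one UK-specific alternate plus a generic English one. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<!-- x-default covers searchers matching no listed language/region. -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```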
Intermediate & Advanced SEO | SilverStar11
-
Does anyone have a clue about my search problem?
After three years of destruction, my site still has a problem, or maybe more than one. OK, I understand I had, and probably still have, a Panda problem. The question is: does anyone know how to fix it without destroying everything? If I had money, I'd gladly give it up to fix this, but all I have is me, a small dedicated promotions team, 120,000+ visitors per month, and the ability to write, edit and proofread.
This is not an easy problem to fix. After completing more than 100 projects, I still haven't got it right; in fact, what I've done over the past 2 months has only made things worse, and I never thought I could do that. Everything has been measured so as not to destroy our remaining ability to generate income, because without that, it's the end of the line. If you can help me fix this, I will do anything for you in return, as long as it is legal, ethical, and won't destroy my reputation or hurt others. Unless you are a master jedi guru, and I hope you are, this will NOT be easy, but it will prove that you really are a master, jedi, guru and time lord, and I will tell the world and generate leads for you.
I've been doing website and SEO stuff since 1996, and I've always been able to solve problems and fix anything I needed to work on. This has me beaten. So my question is: is there anyone here willing to take a shot at helping me fix this, without the usual responses of "change domains," "delete everything and start over," or "you're screwed"? Of course, it is possible that there is a different problem, nothing to do with algorithms: a hard-coded bias or some penalizing setting that I don't know about, a single needle in a haystack.
This problem results in a few visible things: 1. Some pages are buried in supplemental results. 2. Search bots pick up new stories within minutes, but they show up in search results many hours later. Here is the site: http://shar.es/EGaAC On request, I can provide a list of all the things we've done or tried (actually, I have to finish writing it).
Some notes: There is no manual spam penalty. All outgoing links are nofollow, and have been for 2 years. We never paid for incoming links. We did sell text advertising links 3-4 years ago, using text-link-ads.com, but removed them all 2 1/2 years ago. We did receive payment for some stories, 3-4 years ago, but all have been removed.
One more thing. I don't write much; I'm a better editor than a writer. But I wrote a story that had 1 million readers, and the massive percentage of 0.0016% came from you-know-who. Yes, 16 visitors. And this was an exclusive, unique story. There was a similar story with half a million readers: same result. Seems like there might be a problem!
Intermediate & Advanced SEO | loopyal0
-
Using PushState for Meta Data?
Wondering if anyone has had any experience using pushState to update meta data on an AJAX page. What we are trying to do is have one really long page that users can scroll through to see different portfolio pieces. We want each portfolio piece to be represented in Google as a separate page, even though they are technically all on the same page. An example of how the page will work is here:
http://www.scozzese.com/2011/en/#annasafroncik If you notice, you scroll down and the URL updates for the next piece, but you are still on the same page. So if we do this for the meta title and meta description, will Google be able to recognize it? Any help to achieve quality results would be appreciated! If I didn't explain anything clearly, please let me know!
Intermediate & Advanced SEO | lsujoe0
-
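For what it's worth, the pushState approach described in that question can be sketched as follows; the function names, URL patterns, and titles here are hypothetical, and whether Google indexes each state as a separate page is exactly the open question (crawlers may only see the initial HTML rather than these client-side updates):

```javascript
// Hypothetical lookup: derive URL and meta values for a portfolio
// section. In a real app these would come from a CMS or data table.
function metaForSection(slug) {
  return {
    url: '/portfolio/#' + slug,
    title: slug + ' | Portfolio',
    description: 'Portfolio piece: ' + slug,
  };
}

// Apply the values as the user scrolls to a new section.
function applyMeta(slug) {
  const meta = metaForSection(slug);
  // Guarded so the pure logic above can also run outside a browser.
  if (typeof history !== 'undefined' && typeof document !== 'undefined') {
    history.pushState({ slug }, '', meta.url);          // update URL, no reload
    document.title = meta.title;                        // update title tag
    const tag = document.querySelector('meta[name="description"]');
    if (tag) tag.setAttribute('content', meta.description);
  }
  return meta;
}
```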
Anyone have an hour right now to cover some SEO questions
Hi folks, I need someone to Skype with me today on some SEO questions for a WordPress multisite setup I'm in the middle of developing for franchise local sites. I will pay you $95 for the hour. Thanks, Brent. Skype me: cyberbrent (Brent H, Richmond, BC)
Intermediate & Advanced SEO | MenInKilts1
-
New Site: Use Aged Domain Name or Buy New Domain Name?
Hi,
I have the opportunity to build a new website and use a domain name that is older than 5 years, or buy a new domain name. The aged domain name is a .net and includes a keyword.
The new domain would include the same keyword as well as the U.S. state abbreviation. Which one would you use, and why? Thanks for your help!
Intermediate & Advanced SEO | peterwhitewebdesign0
-
Using exact keyword domains for local SEO
The website is for an attorney who serves several nearby cities. The main page is optimized for the biggest central city. I have several options for how to go after the smaller surrounding cities: 1. Create optimized pages inside the main domain. 2. Get more or less exact keyword domains for each city, e.g. for the city ABC, get yourABClawyer.com, and then a) use one-page websites that use the same template as the main website and link all the menu items to the main website; b) use a one-page website with a link "for more information go to our main website"; c) point the exact keyword domains to the optimized pages within the main domain. Which option would be the best in terms of SEO and user experience? Would people freak out if they clicked on a menu item and went to a different domain, even though it uses the same template (option 2a)? Would I get more bounces with option 2b, in your opinion? Would option 2c have any positive SEO effect? Should I not even bother with exact keyword domains and go with option 1?
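If option 2c means 301-redirecting each exact-match domain to the corresponding city page on the main site, the server side might look like this sketch for Apache (all domain names and paths here are hypothetical):

```apache
# Hypothetical vhost for one exact-match domain: permanently
# redirect everything to the matching city page on the main site.
<VirtualHost *:80>
    ServerName yourabclawyer.com
    ServerAlias www.yourabclawyer.com
    Redirect permanent / https://www.mainlawfirmsite.com/abc-lawyer/
</VirtualHost>
```

A permanent (301) redirect is the usual choice here because it signals that the exact-match domain is not meant to stand on its own.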
Intermediate & Advanced SEO | SirMax1
-
How are pages ranked when using Google's "site:" operator?
Hi, If you perform a Google search like site:seomoz.org, how are the pages displayed sorted/ranked? Thanks!
Intermediate & Advanced SEO | anthematic0
-
Use of rel=canonical to view all page & No follow links
Hey, I have a couple of questions regarding e-commerce category pages and filtering options: I would like to implement rel=canonical to the view-all page, as suggested in this article on googlewebmastercentral. If you go to one of my category pages, you will see that both the "next page" link and the "view all" link are nofollowed. Is that a mistake? How does nofollow combine with a canonical to the view-all page? Is it a good thing to nofollow the "sort by" pages, or should I also use noindex for them?
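For context, the setup described in that Google article looks roughly like the sketch below; all URLs are hypothetical:

```html
<!-- On each paginated category page: point the canonical at the
     view-all version, and leave the "next" and "view all" links
     followable so crawlers can actually reach that version. -->
<link rel="canonical" href="https://www.example.com/category/view-all" />

<a href="/category?page=2">Next page</a>
<a href="/category/view-all">View all</a>
```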
Intermediate & Advanced SEO | Ypsilon0