Article Submissions - Still Worth It After the Panda Update?
-
Are article submissions still relevant after the Panda update? Many article directories (EzineArticles, for example) were hit by the Panda update and have not recovered.
-
Now it is my turn to agree with EGOL.
I would add one idea: in general I prefer not to give content away, but periodically sharing a selected article can offer value by earning backlinks to your site.
-
I agree 100% with what Ryan said.
Something else to consider: when you give articles away to other websites, you are handing them content that could make them competitors in the SERPs for your primary and secondary keywords. They could outrank you and cut off your traffic.
Also, if you give high quality articles away they will earn links for other websites instead of your own.
We never give articles away because we don't like to feed existing competitors and create new ones.
-
Yes and No.
YES - If you have a high-quality article with good content and you are submitting it to a quality site, then it is absolutely still worth doing.
NO - If you are talking about the content-farming practice of churning out generic, low-quality 400-word articles with keywords sprinkled in and submitting them to article-farm sites, then no.
Related Questions
-
Old product URLs still indexed and maybe causing problems?
Hi all, need some expertise here. We recently (3 months ago) launched a newly updated site on the same domain. We also added an SSL certificate and dropped the www (with proper redirects), going from http://www.mysite.com to https://mysite.com. I joined the company about a week after launch of the new site.
All pages I want indexed are indexed, on the sitemap, and submitted (submitted in July, but it processes regularly). When I check site:mysite.com everything is there, but so are pages from the old site that are not on the sitemap. These do have 301 redirects. I am finding our non-product pages rank with no problem (including category pages), but our product pages do not, unless I type in the title almost exactly. We 301 redirected all old URLs to the comparable new product, or, if the product is no longer available, to the home page.
For better or worse, and prior to my arrival, the team copied much of the content (descriptions, reviews, etc.) from the old site to create the new product pages. After some frustration and research, I am finding the old pages are still indexed and possibly causing a duplicate content issue. Now, I gather there is supposedly no "penalty", per se, for duplicate content; a page or site will simply not show in the SERPs. Understandable, and this seems to be the case. We also sell a lot of product wholesale, and it turns out many dealers are using the same descriptions we have (and have had) on our site. Some are much larger than us, so I'd expect to be pushed down a bit, but we don't even show in the top 10 pages... for our own product.
How long will it take for Google to drop the old pages and rank the new ones as unique? I have re-written some pages, but much of the content is technical specifications and tough to paraphrase or re-write. I know I could request removal in Search Console, but I no longer have access to the old site. Should I remove the 301s a few at a time and see if the old pages get dropped faster? Maybe just re-write ALL the content? Wait?
As a side note, I'm also on a Drupal CMS with a Shopify ecommerce module, so maybe the shop.mysite.com vs. mysite.com split is throwing off the products(?) (again, the Drupal non-product and category pages rank fine). Thoughts on this would be much appreciated. Thanks so much!
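Not an answer to the ranking question, but before removing any 301s it is worth verifying what each old URL actually returns. A minimal standard-library sketch (the URLs and the `check_redirect` helper are illustrative, not anything from this thread) that reports the status code and Location header without following the redirect:

```python
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect the 301 itself."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def check_redirect(old_url: str):
    """Return (status_code, Location header or None) for a single old URL."""
    opener = urllib.request.build_opener(NoRedirect())
    try:
        resp = opener.open(old_url, timeout=10)
        return resp.status, None                # e.g. 200: no redirect in place
    except urllib.error.HTTPError as e:         # 3xx and 4xx both surface here
        return e.code, e.headers.get("Location")
```

Running this over the old URL list would quickly show any that return 200 (still live duplicates) or redirect somewhere unexpected, which matters far more for duplicate content than how fast Google drops the old pages.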
Intermediate & Advanced SEO | mcampanaro
Has My Site Been Hit by Panda 4.0?
I operate a New York City commercial real estate web site (www.nyc-officespace-leader.com). Ranking and traffic have dropped steeply since early June. Around May 20th a new Panda update was launched by Google, and I wonder if that could partially explain the drop. My site contains the following:
- 300 listing pages. These are product pages and often contain less than 100 words. Many have not been changed in two years.
- 150 building pages. These contain less than 220 words. Many have not been changed in two years.
- 40 blog pages. We have been adding 1 or 2 per month.
- 50 or 60 neighborhood and type-of-space pages. These contain 200-600 words.
Could our drop in traffic be due to Panda? I might add that an upgraded version of the site with new forms and a modified right rail and header was launched on June 6th. Also, we submitted a disavow file with Google on April 20th for about 100 toxic domains, one third of the 300 domains that link to us. In order to take remedial action we need to understand what has happened. Any ideas??? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
Updated My Business Profiles & Still Not Ranking in Local SEO... What Next?
I have used Get Listed and a few other services to update my profiles for my company, Health Care Associates (it's a home health care agency): http://healthcareassociates.net. I have added pictures, categories, descriptions, keywords, etc., and it doesn't seem to help. We are still not ranking in the search engines for "home care grand rapids", "home health care", etc. What else can I do to optimize for local search? Todd
Intermediate & Advanced SEO | t1kuslik
Releasing Multiple-Language Blog Articles?
I was hoping anyone could give me some advice on my situation. Our blog is a huge traffic source for us; we frequently release fresh blog articles on our English-language website, bringing lots of relevant traffic for a variety of relevant topics. Some of these articles would be very useful and relevant for visitors to our German website, so I would like to get them translated and posted on our separate German-language blog on our separate German website. The article text will not change much, as the information is the same for Germany. How should I go about this without running into duplicate content issues with Google? I looked into rel=alternate but wasn't sure it could be used across two separate websites; I also thought about rel=canonical, but it doesn't look like this would be suitable either. Can anybody please give me any advice or thoughts on this?
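For what it's worth, Google's documentation does allow hreflang (rel=alternate) annotations to point across entirely separate domains, as long as the annotations are reciprocal. A sketch with hypothetical domains and paths:

```html
<!-- On the English page, e.g. https://www.example.com/blog/article/ -->
<link rel="alternate" hreflang="en" href="https://www.example.com/blog/article/" />
<link rel="alternate" hreflang="de" href="https://www.example.de/blog/artikel/" />

<!-- On the German page, e.g. https://www.example.de/blog/artikel/ -->
<!-- The annotations must point back (be reciprocal) or they are ignored -->
<link rel="alternate" hreflang="en" href="https://www.example.com/blog/article/" />
<link rel="alternate" hreflang="de" href="https://www.example.de/blog/artikel/" />
```

With reciprocal hreflang in place, translated pages on a second domain are generally treated as language alternates rather than duplicates.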
Intermediate & Advanced SEO | Antony_Towle
SEOmoz crawler is still crawling a subdomain despite disallow
This is for our client with a subdomain. We only want to analyze their main website, as this is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors. We added the disallow code when we started and it was working fine: we only saw the errors for the main domain and we were able to fix them. However, just a month ago, the errors and warnings spiked up, and the errors we saw were for the subdomain. As far as our web guys are concerned, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed, or something we need to do so that the SEOmoz crawler will stop going through the subdomain. Any help is greatly appreciated!
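One thing worth checking: crawlers fetch robots.txt per hostname, so the subdomain needs its own robots.txt with the disallow rule; a rule on the main domain's file does not cover sub.example.com. You can also sanity-check the directives themselves with Python's standard-library parser (the example hostnames are placeholders):

```python
import urllib.robotparser

# The rules as described in the question
ROBOTS_TXT = """\
User-agent: rogerbot
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# rogerbot is blocked from every path...
print(rp.can_fetch("rogerbot", "https://sub.example.com/any-page"))   # False
# ...while other crawlers are unaffected
print(rp.can_fetch("Googlebot", "https://sub.example.com/any-page"))  # True
```

If the parser agrees the rule blocks rogerbot, the next suspects are where the file is served from (it must live at the subdomain's own /robots.txt) and whether a deploy overwrote it.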
Intermediate & Advanced SEO | TheNorthernOffice79
Will links still show in WMT after you disavow them?
Does anyone know a definitive answer to this? I'm thinking they will still show up in WMT's links to your site report. Has anyone seen anything different? Thanks,
Chris
Intermediate & Advanced SEO | Further
Press Release on 3 PR6 Sites vs. Yahoo Submission
Hi - I have shortlisted 3 PR news sites: prlog.org, i-newswire.com, and 24-7pressrelease.com. All 3 have PR6 and high scores. I have 2 queries: a) Is it worth putting in backlinks by taking a paid press release option and spending around $200? The sole intent is to generate backlinks. b) Is the combined effect of these 3 PR sites better than a standalone Yahoo direct submission? As I have a limited budget, I can only do one of them. Thanks
Intermediate & Advanced SEO | Modi
NOINDEX content still showing in SERPs after 2 months
I have a website that was likely hit by Panda or some other algorithm change. The hit finally occurred in September of 2011. In December my developer set the following meta tag on all pages that do not have unique content:
<meta name="robots" content="NOINDEX" />
It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com. I am looking for a quicker solution. Adding this many pages to robots.txt does not seem like a sound option. The pages have been removed from the sitemap (about a month ago). I am trying to determine the best of the following options, or find better ones:
1. 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 or so pages all 301ing to one or two URLs. However, I'd get some link juice to that page, right?
2. Issue an HTTP 404 code on all the pages I want out of the index. The 404 code seems like the safest bet, but I am wondering if it will have a negative impact on my site if Google sees 10,000+ 404 errors all of a sudden.
3. Issue an HTTP 410 code on all pages I want out of the index. I've never used the 410 code, and while most of those pages are never coming back, eventually I will bring a small percentage back online as I add fresh new content. This one scares me the most, but I am interested if anyone has ever used a 410 code.
Please advise, and thanks for reading.
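Before choosing between those options, it can help to confirm the noindex directive is actually present in the HTML Google receives on those pages (a template change can silently drop it). A small audit sketch using only the standard library; the `is_noindexed` helper is illustrative:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collect directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        if (d.get("name") or "").lower() == "robots":
            content = d.get("content") or ""
            self.directives += [t.strip().lower() for t in content.split(",")]


def is_noindexed(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives


print(is_noindexed('<html><head><meta name="robots" content="NOINDEX" /></head></html>'))  # True
print(is_noindexed('<html><head><meta name="robots" content="index, follow"></head></html>'))  # False
```

Run over a sample of the 10,000 URLs, this separates "Google is slow to recrawl" (tag present, just wait or use the URL removal tool) from "the tag never shipped" (tag missing, which no amount of waiting will fix).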
Intermediate & Advanced SEO | NormanNewsome