Abused SEO unintentionally, now need a way out
-
Hello,
I have been in contact with a smo to optimize my site for search engines and social media sites. my site was doing great from last 4 years.
but suddenly it started dropping in ranking. then i came and joined seomoz pro to find a way out.
i was suggested to categories content in form of subdomains ... well that put a huge toll on my rankings.. thanks to suggestions here i have 301 them to sub directories.
Now another huge question arises. i found out that my smo guy was taking artificial votes or whatever youc all them on twitter, facebook and g+ ...twitter and facebook's are understandable but i am getting to think that these votings on g+ might have affected my site's ranking ?
here is a sample url http://www.designzzz.com/cutest-puppy-pictures-pet-photography-tips/
if you scroll below you will see 56 google plus 1s...
now the big question is, i have been creating genuince content. but nowt hat i am stuck in this situation, how to get out of it ?
changing urls will be bad for readers.. will a 301 will fix it ? or any other method.
thanks in advance
-
_I have checked some of the posts on the home page and I see certain patterns, and this is quite predictable:
1. Most of the posts are basically compilations of images already available on other sites. Though the text is unique, the images are nothing new.
2. I found two content-heavy posts on the home page, and I believe they are guest posts. Now, guest blogging is great, but what I mean to say is that you should not be too dependent on guest blogging here. You need to come up with two or three posts [I prefer content-heavy posts] of your own in between.
3. It seems you are publishing reviews. If these reviews are paid, make sure you add a nofollow tag to the links. Though it is the linked sites that should be penalized for this, I prefer to be safe than sorry.
I know this may sound a bit harsh. I hope you don't mind._
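On the paid-review point, the nofollow tag mentioned above is just an attribute on the link itself; something like this (example.com and the anchor text are placeholders):

```html
<!-- A paid review link marked so it passes no PageRank -->
<a href="http://example.com/" rel="nofollow">Example Product</a>
```

Google only asks that paid links carry this attribute; the visible part of the review does not need to change.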
-
No problem friend. You are most welcome.
-
Great, thanks a lot, dude :}
-
That's true, Ayaz. A 301 does not serve the purpose in this case. As you have stopped this, sit tight and let us see how the 301 from the subdomains to the subdirectories works out. In my experience, adding new, unique, original and fresh content will speed up recovery from a penalty, if one has been applied.
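For reference, the subdomain-to-subdirectory 301 setup discussed here can be done in Apache with mod_rewrite; a minimal sketch, assuming the subdomain was blog.example.com (a placeholder for whatever category subdomain was used):

```apache
# 301-redirect every URL on the subdomain to the matching
# path under /blog/ on the main domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/blog/$1 [R=301,L]
```

The R=301 flag makes it a permanent redirect, which is what passes the link equity from the old subdomain URLs to the new subdirectory URLs.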
Wish you all the best.
Best regards,
Devanur Rafi.
-
Devanur,
I got this stopped a couple of months ago, as I wasn't feeling comfortable with it at all.
At first I thought of adding a 301 to new URLs, but then I noticed that a 301 moves the +1 count as well, so of course it won't work.
Oh well, let's see how it goes now.
thanks for all the help
-
Ayaz, by now you should have asked those guys to stop this. If you have not, please do so quickly. As for undoing it, even I don't have a clue. Most of these would be fake accounts, or accounts created only for this purpose. The best thing you can do for now is to stop it from growing further. Moreover, we are not sure whether these inflated social signals are the cause of the drop in rankings. My gut feeling says there is nothing to worry about here and these are not responsible for the loss. Having said that, please stop this immediately and consult the SEO guy who did it to find out what he has to say about it.
Best regards,
Devanur Rafi.
-
Thanks Devanur.
But this leaves me clueless about what I should do now :{
I need to undo these penalties I am suffering from.
-
Hi Ayaz,
Though I cannot prove a direct connection between the drop in your rankings and the artificially inflated social signals in this case, I can definitely say one thing: Google can sniff out things like this very easily, and it may be too late before you even see the axe coming down on you. Since social signals have a strong impact on search engine rankings in certain niches, we should not try to game them for a boost in the SERPs. Next time you hire an SEO or SMO, please make sure to get a list of deliverables from them. Black hat tactics and other attempts to cheat search engines might work temporarily, but you should run away from them by all means if you want to be safe.
Best regards,
Devanur Rafi.