How would you optimize a new site?
-
Hi guys, I'm here to ask for your personal opinions.
We know that to rank a site in search you need to create authoritative content that interests people. But what would you do to increase the ranking of your site, or of a single blog post?
Leaving your link in blog comments seems dangerous nowadays. Is social media the only way to go? Trying to get people to write about you? What else can be done?
-
First of all, every blog comment section I have ever seen uses nofollow links, so they are really a waste of time in terms of SEO; in terms of driving traffic they are OK.
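As a quick way to verify the nofollow point above on any given blog, you can scan the page's links for the nofollow token. A minimal sketch using only Python's standard library (the function names are my own):

```python
from html.parser import HTMLParser

class LinkRelParser(HTMLParser):
    """Collect every <a href=...> on a page along with its rel attribute."""

    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, rel) tuples

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            if "href" in attrs:
                # rel may be absent or valueless; normalise both to ""
                self.links.append((attrs["href"], attrs.get("rel") or ""))

def nofollow_hrefs(html):
    """Return the hrefs whose rel attribute includes the nofollow token."""
    parser = LinkRelParser()
    parser.feed(html)
    return [href for href, rel in parser.links if "nofollow" in rel.split()]
```

Feed it a comment page's HTML: any href it returns would pass no link equity even if your comment were accepted.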
Spend time writing quality content over quantity content, but more importantly, when writing it, have a list of news sites, blogs, and contacts in mind who you plan to outreach the article to.
The hardest part is the outreach: anyone can write a great article, but you need to get it picked up and distributed. In an ideal world, you would already have made contact with the journalists beforehand to see what articles they want.
IMO journalists are very busy, so if you can provide them with a great article which happens to include a link to your site, especially when they are rushing to hit a deadline, you have a higher chance of it being accepted on their site.
Is social media the way to go? It depends on what industry you operate in; in some industries this could be a complete waste of time, especially if you are a B2B company. Plus, with Facebook's latest algorithm update, it's actually quite hard to get a social media post in front of your audience.
Biggest tip: quality articles over quantity, and spend as much if not more time on outreach to relevant blogs and websites to get your content picked up and linked to.
Related Questions
-
Competitor has same site with multiple languages
Hey Moz, I am working with a dating review website and we have noticed that one of our competitors is basically making duplicates of their site on .com, .de, .co.uk, etc. My first thought is that this is basically a way to game the system, but I could be wrong. They are tapping into Google's geo results by including major cities in each state, i.e. "dating in texas", "dating in atlanta", yet the content itself doesn't really change. I can't figure out exactly why they are ranking so much higher. For example, some other SEO tools give them a traffic estimate of $500,000 monthly, whereas we are sitting around $2,000. So either the traffic estimates grossly misrepresent traffic volume, or they really are crushing it. TL;DR: Is geo-locating/translating sites a valid way to create backlinks? It seems a lot like a PBN.
White Hat / Black Hat SEO | HashtagHustler
-
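For what it's worth, the legitimate version of what that competitor may be attempting is hreflang annotation, which tells Google which language/region each near-duplicate serves. A minimal sketch that emits the tags each mirror would carry; the example.com hosts are hypothetical stand-ins for the ccTLD pattern described in the question:

```python
# Region codes mapped to the ccTLD mirrors (hypothetical hosts standing in
# for the .com / .de / .co.uk pattern from the question).
MIRRORS = {
    "en-us": "https://www.example.com",
    "de-de": "https://www.example.de",
    "en-gb": "https://www.example.co.uk",
}

def hreflang_tags(path):
    """Emit the <link rel="alternate" hreflang> tags each mirror should carry
    for a given page path, so the duplicates reference each other explicitly."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{host}{path}" />'
        for code, host in sorted(MIRRORS.items())
    )
```

Without something like this, near-identical ccTLD copies are exactly the duplicate-content gamble the question suspects.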
Duplicate content - multiple sites hosted on same server with same IP address
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we use rel="canonical" on the websites, those tags will be duplicated too, as the websites are mirrored versions of the sites served at the IP address, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
White Hat / Black Hat SEO | Jade
-
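One common way to resolve this is a host-level 301 rather than per-page canonical tags: any request arriving on the bare IP is redirected to the real hostname, so the IP-addressed mirror never competes with the site. A minimal sketch of the decision logic, using the example host and IP from the question (in practice this rule would live in the web-server config):

```python
# Host-level canonicalisation sketch: requests on an alias host (here, the
# bare IP from the question) get a 301 to the canonical hostname.
CANONICAL_HOST = "www.domainname.com"
ALIAS_HOSTS = {"23.34.45.99"}

def canonical_redirect(host, path):
    """Return the 301 target URL for an alias host, or None if the request
    already arrived on the canonical hostname."""
    if host in ALIAS_HOSTS:
        return f"http://{CANONICAL_HOST}{path}"
    return None
```

With three sites on one IP you would key the alias map per site, but the principle is the same: the IP version should never answer with content, only with a redirect.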
How to re-rank an established website with new content
I can't help but feel this is a somewhat untapped resource with a distinct lack of information.
White Hat / Black Hat SEO | ChimplyWebGroup
There is a massive amount of information around on how to rank a new website, or on techniques to increase SEO effectiveness, but ranking a whole new set of pages, or 're-building' a site that may have suffered an algorithmic penalty, is a harder nut to crack in terms of information and resources. To start, I'll provide my situation: SuperTED is an entertainment directory SEO project.
It seems likely we suffered an algorithmic penalty at some point around Penguin 2.0 (May 22nd), as traffic dropped steadily from then on, though not too aggressively. Then, to coincide with Panda 27 (according to Moz) in late September this year, we decided it was time to reassess our tactics against the Google guidelines we'd been working under for two years. We've slowly built a natural link profile over that time, but thin content was likely also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters to remove links (unfortunately there was some 'paid' link building before I arrived).
- Disavowed the rest of the unnatural links that we couldn't have removed manually.
- Worked on page speed as per Google guidelines until we received high scores in the majority of speed-testing tools (e.g. WebPageTest).
- Redesigned the entire site with speed, simplicity, and accessibility in mind.
- Used .htaccess to rewrite 'fancy' URLs, removing file extensions and simplifying the link structure.
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google - think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. We 404'ed them and asked for URL removal via WMT; we're thinking of 410'ing them.
- Added new content and pages that follow Google's guidelines as far as I can tell, e.g. main category pages and sub-category pages.
- Started to build new links to our now content-driven pages naturally by asking our members to link to us via their personal profiles. We offered an internal reward system for this, so we've seen a fairly good turnout.
- Addressed many other possible ranking factors: added Schema data, optimised for mobile devices as best we can, added a blog and began publishing original content, expanded our social media reach, added custom 404 pages, removed duplicate content, made use of Moz, and much more.
It's been a fairly exhaustive process, but we were happy to do it to be within Google's guidelines. Unfortunately, some of the link-wheel pages mentioned above were the only pages driving organic traffic, so once we were rid of them, traffic dropped to not even 10% of what it was previously. Equally, with the .htaccess changes to the link structure and the creation of brand-new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed the pages that were 'replaced' with much better content and a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example. So, with the loss of the 'spammy' pages and the creation of brand-new content-driven pages, we've probably lost up to 75% of the old website, including the pages that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the URL changes, and the rest discussed above, the site likely looks very new, and very heavily updated in a short period of time. What I need to work out is a campaign to drive traffic to the 'new' site.
We're naturally building links through our own customer base, so they will likely be seen as quality, natural links.
Perhaps the sudden occurrence of a large number of 404s and 'lost' pages is affecting us?
Perhaps we've yet to be properly re-indexed; it has been almost a month since most of the changes were made, and we'd often be re-indexed 3 or 4 times a week before the changes.
Our events page is the only one left without the new design; could this be affecting us? It may look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued a manual penalty. Was I perhaps too hasty in following the rules? I'd appreciate a professional opinion, or input from anyone who has experience with a similar process. It does seem fairly odd that following guidelines and general white-hat SEO advice could cripple a domain, especially one with age (the domain is 10+ years established) and relatively good domain authority within the industry. Many, many thanks in advance. Ryan.
-
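For the URL restructure Ryan describes, the 301 mapping can be expressed as a single rule: collapse the old two-level /profiles.php/<category>/<page> path to the new single-level one. A minimal sketch of that rule in Python, using the URLs from the question (in practice this would be a RedirectMatch or mod_rewrite rule in .htaccess):

```python
import re

# Old two-level profile paths, e.g. /profiles.php/bands-musicians/wedding-bands,
# collapse to the new single-level form, e.g. /profiles.php/wedding-bands.
OLD_PROFILE = re.compile(r"^/profiles\.php/[^/]+/([^/]+)$")

def new_location(old_path):
    """Return the 301 target for a legacy profile URL, or None if no rule applies
    (already-new single-level paths fall through untouched)."""
    match = OLD_PROFILE.match(old_path)
    return f"/profiles.php/{match.group(1)}" if match else None
```

A blanket rule like this is what keeps the Page Authority of the old pages flowing to the new ones instead of dead-ending in 404s.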
Site ranking at position 1 after existing for 1 day
Hi, I work in online gaming, and for a few months there was a website called 'Htmlwijzer.nl' that ranked on page 1 for 'online casino' in Google.nl, which is of course a highly competitive keyword, and it remained there for about 2 months. This website didn't come up slowly in Google.nl from page 10 to page 1; one day it was just there, on page 1. It only had HTML-related informational content, and it ranked for 'online casino' with a layer ad on its site that was shown only to users in the Netherlands. Now that website has received a penalty, and since today a new site is at position 1 in Google, called www.casinowijzer.nl. It has no backlinks at all (only a 301 from the former domain). It just popped up there. Does anyone know how this website could have gotten here? It's obviously black hat, but for a keyword like 'online casino' it's quite amazing. Thanks,
White Hat / Black Hat SEO | iwebdevnl
-
Methods for getting links to my site indexed?
What are the best practices for getting links to my site indexed in search engines? We have been creating content and acquiring backlinks for the last few months, but they are not being found by backlink checkers or in Open Site Explorer. What are the tricks of the trade for improving the time to indexing of these links? I have read about some RSS methods using WordPress sites, but that seems a little shady and I am sure Google is looking for that now. I look forward to your advice.
White Hat / Black Hat SEO | devonkrusich
-
Has anyone been able to recover a site that was slapped by Panda?
I have a client where the only problem I can determine is over-optimization of a couple of anchor terms, which the site no longer ranks for. I've tried mixing things up with the brand name, brandname.com, and a diversity of links, but nothing seems to budge. Does anyone have a similar problem?
White Hat / Black Hat SEO | foreignhaus
-
Opinions Wanted: Links Can Get Your Site Penalized?
I'm sure by now a lot of you have had a chance to read Let's Kill the "Bad Inbound Links Can Get Your Site Penalized" Myth over at SearchEngineJournal. When I initially read this article, I was happy: it was confirming something that I believed and supporting a stance that SEOmoz has taken time and time again. The idea that bad links can only hurt via loss of link juice when they get devalued, not through any sort of penalization, appears in many articles across SEOmoz. Then I perused the comments section, and I was shocked and unsettled to see some industry names I recognized taking the opposite side of the issue. There seem to be a few different opinions:
- The SEOmoz opinion that bad links can't hurt except when they get devalued.
- The idea that you wouldn't be penalized algorithmically, but a manual penalty is within the realm of possibility.
- The idea that both manual and algorithmic penalties are a factor.
Now, I know that SEOmoz preaches a link-building strategy that targets high-quality backlinks, so if you completely subscribe to the Moz method, you've got nothing to worry about. I don't want to hear those answers here - they're right, but they're missing the point. It would still be prudent to have a correct stance on this issue, and I'm wondering if we have that. What do you guys think? Does anybody have an opinion one way or the other? Does anyone have evidence of it being one way or another? Can we set up some kind of test - rank a page for an arbitrary keyword, then go to town blasting low-quality links at it as a proof of concept? I'm curious to hear your responses.
White Hat / Black Hat SEO | AnthonyMangia
-
How much pain can I expect if I change the URL structure of the site again?
About 3 months ago I implemented a massive URL structure change by 'upgrading' some features of our CMS. Prior to this, URLs for categories and products looked something like this: http://www.thefurnituremarket.co.uk/proddetail.asp?prod=OX09. I made a few changes but didn't implement them fully, as I felt it would be better to do it in stages while the site was getting indexed more thoroughly. HOWEVER... we have just hit the first page for some key SERPs, and I am wary of rocking the boat again by changing the URL structures and all the sitemaps. How much pain do you think we could feel if I went ahead and optimised the URLs fully? And what would you do? 🙂
White Hat / Black Hat SEO | robertrRSwalters
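If the restructure does go ahead, the main safeguard is a complete 301 map from the old query-string URLs to the new ones so existing rankings transfer. A minimal sketch of one such mapping, using the example URL from the question; the /products/<code> target scheme is a hypothetical choice, not something stated in the thread:

```python
from urllib.parse import urlsplit, parse_qs

def pretty_product_path(old_url):
    """Map a legacy proddetail.asp?prod=... URL to a clean path.

    The /products/<code> target scheme is hypothetical; the source URL
    shape is the one quoted in the question.
    """
    parts = urlsplit(old_url)
    if parts.path.endswith("/proddetail.asp"):
        prod = parse_qs(parts.query).get("prod")
        if prod:
            return f"/products/{prod[0].lower()}"
    return None  # no rule: fall through (other pages need their own rules)
```

Running every indexed URL through a map like this before flipping the switch shows exactly which pages would be left without a redirect.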