How to improve visibility of new content
-
What are the best SEO practices to improve the visibility of new content in the SERPs, apart from metadata?
-
This is a long answer.
To get content to rank, you need a good plan in place before you write the content, and patience afterward.
Make sure your content is something people are interested in and willing to share before you actually write it. Good content starts with keyword and viral research. For example, let's say your site sells dog collars. Instead of writing all your content about dog collars, think in broader terms: dog collars for training, things you need to know about puppies, dog skin irritations, whether prong collars hurt or help training, and so on. Do keyword research to find what these terms can be (use Google AdWords to estimate traffic).
Next, use Ahrefs' Content Explorer to see what content related to your keywords has gone viral in the last 24 hours to 6 months.
For example: if you type in "dog collars," it shows that in the past 24 hours an article called "16 Things You Need to Know About Getting a Puppy" has been shared on Facebook 4.2k times and retweeted 82 times. Narrow your keyword list to the top content ideas. Next, go to Google and search your new keyword list to see what ranks for those terms.
Next, quickly use Moz's Open Site Explorer to check the DA, PA, and link and spam metrics of the sites on the front page of those results. Try to identify sites you could knock out with a properly optimized piece and some links.
Narrow down your keyword list again based on your results, then run a keyword difficulty report in Moz. Run a FULL REPORT so you can see all the different variables that make those sites rank. This gives you a strategy for what you will need to outrank what is already there.
Next, start writing your content based on what is MOST SHARABLE SOCIALLY, what is MOST ATTAINABLE to rank for, and what shows decent traffic. Make sure you follow proper on-page SEO. Use Moz's On-Page Grader to grade your content while you write it, check the results, and adjust your content to get the best grade. You should also grade the competitor pages you are trying to outrank, to make sure your on-page SEO is at least as good as theirs.
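The kind of checks an on-page grader runs can be approximated in a few lines. Here is a minimal sketch using only Python's standard library; the check list, the 60-character title guideline, and the sample HTML are illustrative assumptions, not what any particular tool actually does:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the <title> and first <h1> text from an HTML document."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.h1 = ""

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

def check_page(html, keyword):
    """Return a list of basic on-page findings for a target keyword."""
    parser = OnPageChecker()
    parser.feed(html)
    findings = []
    if keyword.lower() not in parser.title.lower():
        findings.append("keyword missing from <title>")
    if len(parser.title) > 60:
        findings.append("<title> longer than ~60 characters")
    if not parser.h1:
        findings.append("no <h1> on the page")
    elif keyword.lower() not in parser.h1.lower():
        findings.append("keyword missing from <h1>")
    return findings

# Hypothetical page for the dog-collar example above
html = ("<html><head><title>Dog Collars for Training</title></head>"
        "<body><h1>Dog collars for training</h1></body></html>")
print(check_page(html, "dog collars"))  # → []
```

Running the same function over your competitors' pages gives you a quick side-by-side of the basics before you bother with a full tool report.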
Link tip: see who the top sites are linking to, and link to those resources or better ones. Research in Moz, Ahrefs, and Majestic to find who is linking to the content that already ranks, and see whether there are ways to build links from those same places. Replicate what is making them rank; followed and nofollowed links are both good as long as they are relevant.
After you are happy with your content, it's time to distribute it. This is the hard part.
Make sure you use Google Webmaster Tools / Search Console to Fetch and Render the new page, then submit it to the index.
Start your social distribution campaign by first posting to your site's social media pages.
Check the article's visibility by seeing how much it is shared, liked, retweeted, and so on.
If you are posting to a business's Facebook page, you can see how many people have viewed the post. You may be surprised how few people see posts at first, due to Facebook's sharing algorithm.
There are two ways to get more distribution on Facebook. The easy way is to pay for it: promote/boost your post, or run an ad with your post pointing to your content, and make sure you target the ad to your preferred audience.
The harder way is to get engagement on your post (likes, shares, and comments) in the first hour(s) after your post is published.
If you run an ad, you can track conversions on sales or goals with Facebook's pixel, so you can see people coming back and purchasing after the first visit. The pixel will also place your post/ad in front of visitors after they leave your site if they did not complete a transaction.
Hopefully you can get some traffic from here. Check your web analytics to see which networks sent better traffic: more conversions, more time on site, etc. Based on this data, you can invest more time and money in promoting to that network, since your target customers are more likely there than on the other networks.
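Comparing networks this way boils down to a simple ranking over your analytics export. Here is a minimal sketch; the network names, session counts, and conversion figures are made-up sample data, and conversion rate is used as the sorting signal purely for illustration:

```python
# Hypothetical rows from an analytics export:
# (network, sessions, conversions, avg_seconds_on_site)
sessions = [
    ("facebook", 1200, 18, 45),
    ("twitter",   300,  2, 20),
    ("linkedin",  150,  6, 90),
]

def rank_networks(rows):
    """Sort networks by conversion rate, best first."""
    scored = []
    for network, visits, conversions, avg_time in rows:
        rate = conversions / visits if visits else 0.0
        scored.append((network, rate, avg_time))
    return sorted(scored, key=lambda r: r[1], reverse=True)

for network, rate, avg_time in rank_networks(sessions):
    print(f"{network}: {rate:.1%} conversion, {avg_time}s avg time on site")
```

In this sample, LinkedIn sends far fewer visits than Facebook but converts at a much higher rate, which is exactly the kind of result that should steer where you spend your next promotion budget.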
Monitor the groups on these networks as well to see if your content can add value to the discussion. A word of caution: DON'T SPAM the groups; it just looks bad for you in the long run. As with reputation management, it is better to already be part of the group with an established presence built by sharing valuable content. Then, when you share content you're affiliated with, it won't be seen as spam as long as it adds value, and you won't get banned.
To see if Google has indexed your new page, do a site:websiteURL search and check whether your new page is listed.
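If you check indexation for many pages, it helps to generate the site: search URLs instead of typing them. A tiny sketch, assuming a Google web-search URL of the usual `google.com/search?q=` form:

```python
from urllib.parse import urlencode

def site_query_url(page_url):
    """Build a Google search URL for a site: indexation check on one page."""
    return "https://www.google.com/search?" + urlencode({"q": f"site:{page_url}"})

# Hypothetical page from the dog-collar example
print(site_query_url("example.com/dog-collars-for-training"))
```

You can map this over every URL in a new content batch and open each result to confirm the page is in the index.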
Next, search your target keyword and its synonyms to see where you rank for those. Record your positions.
Next, you can use Google to look for outreach points: sites you can contact for traditional link building. If your content is original and adds value to the conversation, you have a better chance of getting good links. Don't worry if you don't get many; this can be really hard depending on what you are writing about.
Check your Google result positions regularly. (I do it every morning and again throughout the day.)
Track your results and keep doing competitive research. As you learn more about what your competitors are doing, repeat your outreach process to get more exposure.
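The position tracking above is just a dated log you can query later. A minimal sketch of such a rank-tracking sheet as a CSV file; the file name, keyword, and positions are hypothetical:

```python
import csv
from datetime import date

def log_rank(path, keyword, position):
    """Append today's ranking for a keyword to a CSV rank-tracking sheet."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), keyword, position])

def best_rank(path, keyword):
    """Return the best (lowest) recorded position for a keyword."""
    with open(path, newline="") as f:
        positions = [int(row[2]) for row in csv.reader(f) if row[1] == keyword]
    return min(positions) if positions else None

open("ranks.csv", "w").close()  # start a fresh sheet for the example
log_rank("ranks.csv", "dog collars for training", 14)
log_rank("ranks.csv", "dog collars for training", 9)
print(best_rank("ranks.csv", "dog collars for training"))  # best position logged so far
```

The same CSV opens directly in any spreadsheet tool, so your daily checks accumulate into the blueprint mentioned at the end of this answer.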
If your site has low authority, you can have the most amazing content and still have trouble ranking or getting traffic. For example, in the hair product space, L'Oréal will always have an easier time ranking a page than a small hair care brand. It can be done, but it is a harder job because of the trust Google places in authority. To help a lower-authority site rank, you need social presence and social importance.
Hope this helps. Make sure you track all of this in a spreadsheet of some sort, so the next time you launch a new piece of content you can just follow a blueprint. Moz's Whiteboard Fridays cover how to do a lot of this, and Ahrefs has a great series called Oversimplified SEO that covers all of it in easy-to-understand terms. Let me know if you have any questions.
Best,
Erick -
If this is good content that is highly shareable, I would start by promoting it prominently on every page of my website. If you use social media, get it out there with a carefully crafted pitch.