Big hit to traffic a while ago, and slow recovery. Is there anything we've missed?
-
We took a big hit to our organic traffic when we implemented an HTML form that included a list of every country in the world, twice.
The form rolled out onto every page of our website, and Google indexed it: Webmaster Tools showed our content keywords as the country names from the form, occurring 9,000+ times across the site.
We've fixed this and the content keywords are back to normal, but our traffic has not yet fully recovered.
Is there anything on our site that you think could be sending spam signals to Google, or could be impeding our organic traffic growth?
-
Amy,
I'm sorry you have to deal with a snafu like this.
I noticed that the content most relevant to each page's keyword tends to be pushed down below what reads like "SEO copy". For example, on the page about moving from the UK to Adelaide, Australia, the entire first screen's worth of content is general information about Adelaide, and you have to scroll down quite far to reach anything specifically about the UK. Another example: on the country-level pages (e.g. /spain, /thailand), the "Top Cities" section, with its score bars and relevant links, is pushed down below several long paragraphs about the country. Your users are probably already aware of the basics about a country if they want to move there, so perhaps the more visually appealing and helpful "Top Cities" area should move to the top? If you're worried about the SEO ramifications of moving the text content down, you could trial the change on a few country landing pages first.
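If you do run it as a test, one simple way to pick the test pages is deterministic hashing of the URL path, so each page always renders the same layout variant. A minimal sketch, purely illustrative — the 10% fraction and the paths are placeholders, not anything from your site:

```python
import hashlib

def in_layout_test(page_path: str, test_fraction: float = 0.1) -> bool:
    """Deterministically assign a fraction of pages to the
    'Top Cities first' layout test, based on a hash of the URL path."""
    digest = hashlib.md5(page_path.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform-ish value in [0, 1]
    return bucket < test_fraction

# Example: see which (hypothetical) country landing pages fall into the test group
for path in ["/spain", "/thailand", "/usa", "/france"]:
    print(path, in_layout_test(path))
```

Because the assignment is a pure function of the path, the test group stays stable across deploys, and you can compare engagement and rankings for test vs. control pages over a few weeks.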
Let's think about the visitor for a minute. If I searched for "moving to Dallas from the UK" and landed on this page, I would not be happy: http://www.movehub.com/usa/dallas/move-to-dallas-from-uk . I would be looking for information on how to move to Dallas from the UK: visa requirements, good vs. bad neighborhoods, cost of living, the job market, etc. Instead, all I see on my screen is fluff copy about how Texas isn't all about oil production anymore, plus sidebar links to content about international container shipping costs and moving to cities in Australia and Canada. The site needs a more intelligent way to show relevant content and links on a per-page basis. Some examples: instead of showing links to Adelaide, Brisbane, Perth, Sydney, Toronto... on that page, show the cities closest to Dallas, like San Antonio, Austin, and Houston. Instead of showing the general "about Texas" content at the top, move it down or remove it altogether, and surface the content that was once hidden below the fold: "Moving to Dallas from the UK" and "Comparing Dallas vs. London". Add links to pages about visa requirements from the UK to the US and some job-search assistance (top employers in Dallas?) and you'll have a much more useful page.
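The "closest cities" idea could be driven by a simple great-circle distance lookup against whatever city database powers the site. A rough sketch, assuming you have (or can geocode) a latitude/longitude per city page — the coordinates below are approximate and purely for illustration:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical (lat, lon) pairs; in practice these would come from the site's city data.
CITIES = {
    "Dallas": (32.78, -96.80),
    "San Antonio": (29.42, -98.49),
    "Austin": (30.27, -97.74),
    "Houston": (29.76, -95.37),
    "Sydney": (-33.87, 151.21),
    "Toronto": (43.65, -79.38),
}

def haversine_km(a, b):
    """Great-circle distance in kilometres between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def nearest_cities(city, n=3):
    """Return the n cities closest to `city`, e.g. for sidebar links."""
    origin = CITIES[city]
    others = [c for c in CITIES if c != city]
    return sorted(others, key=lambda c: haversine_km(origin, CITIES[c]))[:n]

print(nearest_cities("Dallas"))  # → ['Austin', 'Houston', 'San Antonio']
```

Precomputing this once per city page is cheap, and it means the Dallas page links to Texas neighbors instead of Adelaide and Toronto.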
More specific to the problem you experienced, however: if the content is no longer on the page, it may just take Google some time to recrawl all of the old URLs and see the updated content without the "keyword stuffing" they may have misinterpreted. My advice would be to refresh your XML sitemap with new lastmod dates and resubmit it, to entice Google to recrawl the pages and see that the excessive keyword use has been fixed.
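Regenerating the sitemap with fresh lastmod dates is straightforward if you can script it. A minimal sketch, assuming you have the list of URLs to hand (the example URL is just one of yours from above):

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(urls, lastmod=None):
    """Build a minimal XML sitemap with a fresh <lastmod> on every URL,
    so resubmitting it nudges Google to recrawl the cleaned-up pages."""
    lastmod = lastmod or date.today().isoformat()
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["http://www.movehub.com/usa/dallas/move-to-dallas-from-uk"]))
```

Only bump lastmod on pages that genuinely changed (i.e. the ones that had the form), then resubmit the sitemap in Webmaster Tools.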