Switching site content
-
I have been advised to take a particular path with my domain. To me it seems "black hat", but I'll ask the experts:
Is it acceptable, when one owns an exact-match location domain (e.g. london.com), to run it as a tourist information site, gathering links from Wikipedia, the BBC, local paper/radio/sports websites, etc., and then after 6-12 months switch the content to a business site?
What could the penalties be?
Please advise...
-
Wow, some great answers.
Thank you.
I'm in such a predicament now. I intended to keep the content and have a local section on the site with local events, a local forum, etc., thus also retaining a community around it.
The anchor text I was going to build during the non-business phase was 'placename'.
And as I want to build the new business as a brand, its new anchor text would also be 'placename'.
I eventually want to pursue 'seo placename', but I still have a lot of learning and practice to do (1-2 years), so I wanted to build up "site age" to go with the domain's 20-year registration age (is this relevant to rankings?).
Or would I be left with the wrong type of community?
-
The problem with "bait and switch" is that it doesn't really work with content and relevancy signals, including anchor text.
Google has filed several patents over the years showing how they can devalue anchor text when they see changes in context. A broad example would be building links to a site about "red pickups" and then changing everything to "Tom Hanks". In this situation, you're likely not going to rank nearly as well for "Tom Hanks" after the switch as you did for "red pickups", so you've likely wasted a lot of link equity.
That said, this technique does work "a little". That's why you see black-hatters buying old websites and throwing up spam. The new pages can sometimes get a little traffic based on authority and link juice, but in my opinion the ROI is so small, and the risks so large, that it's not worth the effort.
A better strategy, in my opinion, is to build out your business website in full view of the world and attract links related to your industry in creative and surprising ways. Instead of bait and switch, surprise and delight people with the unexpected. This way you retain 100% of any link equity you build, and the rewards pay off more down the road.
-
You could possibly get away with it if the topic of the new site was in some way related to the content on the old site (the closer the better). However, the further the new site's topic gets from the context surrounding the backlinks that were created for the old site, the less value it's going to get from them.
Regardless of your domain name, it still takes effort to build solid links. Going through the exercise of getting good links makes the company a better company, a better competitor, and a better search result and will most likely give a better ROI than a bait and switch tactic.
-
I wonder the same thing. It's way easier to build links for a non-commercial site. If you were to bait and switch, I'd think you'd want to keep the basic content that was linked to, to be fair to those who linked to you.
Related Questions
-
How to re-rank an established website with new content
White Hat / Black Hat SEO | ChimplyWebGroup
I can't help but feel this is a somewhat untapped resource, with a distinct lack of information.
There is a massive amount of information around on how to rank a new website, or techniques to increase SEO effectiveness, but how to rank a whole new set of pages, or 're-build' a site that may have suffered an algorithmic penalty, is a harder nut to crack in terms of information and resources. To start, I'll provide my situation: SuperTED is an entertainment directory SEO project.
It seems likely we suffered an algorithmic penalty around Penguin 2.0 (May 22nd), as traffic dropped steadily after that, though not too aggressively. Then, to coincide with the newest Panda update in late September this year (Panda 27, according to Moz), we decided it was time to re-assess tactics to keep in line with Google's guidelines. We've slowly built a natural link profile over the last two years, but it's likely thin content was also an issue. So from the beginning of September to the end of October we took these steps:
- Contacted webmasters (unfortunately there was some 'paid' link building before I arrived) to get links removed.
- 'Disavowed' the rest of the unnatural links that we couldn't have removed manually.
- Worked on page speed as per Google's guidelines until we received high scores in the majority of speed-testing tools (e.g. WebPageTest).
- Redesigned the entire site with speed, simplicity and accessibility in mind.
- Used .htaccess rules on 'fancy' URLs to remove file extensions and simplify the link structure.
- Completely removed two or three pages that were quite clearly just trying to 'trick' Google - think a large page of links that simply said 'Entertainers in London', 'Entertainers in Scotland', etc. We 404'ed them and asked for URL removal via WMT; thinking of 410'ing?
- Added new content and pages that seem to follow Google's guidelines as far as I can tell, e.g. a main category page and sub-category pages.
- Started to build new links to our now 'content-driven' pages naturally by asking our members to link to us via their personal profiles. We offered a reward system internally for this, so we've seen a fairly good turnout.
- Addressed many other 'possible' ranking factors: added Schema data, optimised for mobile devices as best we can, added a blog and began blogging original content, worked to expand our social media reach, added custom 404 pages, removed duplicate content, utilised Moz, and much more.
It's been a fairly exhaustive process, but we were happy to do it to be within Google's guidelines. Unfortunately, some of those link-wheel pages mentioned previously were the only pages driving organic traffic, so once we were rid of them, traffic dropped to not even 10% of what it was previously. Equally, with the changes (.htaccess) to the link structure and the creation of brand-new pages, we've lost many of the pages that previously held Page Authority.
We've 301'ed those pages that have been 'replaced' with much better content and a different URL structure - http://www.superted.com/profiles.php/bands-musicians/wedding-bands to simply http://www.superted.com/profiles.php/wedding-bands, for example. Therefore, with the loss of the 'spammy' pages and the creation of brand-new 'content-driven' pages, we've probably lost up to 75% of the old website, including the pages that were driving any traffic at all (even with potential thin-content algorithmic penalties). Because of the loss of entire pages, the changed URLs and the rest discussed above, the site likely looks very new and heavily updated in a short period of time. What I need to work out is a campaign to drive traffic to the 'new' site.
We're naturally building links through our own customer base, so they will likely be seen as quality, natural links.
Perhaps the sudden occurrence of a large number of 404s and 'lost' pages is affecting us?
Perhaps we're yet to be properly re-indexed, but it has been almost a month since most of the changes were made, and we'd often be re-indexed 3 or 4 times a week before the changes.
Our events page is the only one left to update with the new design - could this be affecting us? It may potentially look like two sites in one.
Perhaps we need to wait until the next Google 'link' update to feel the benefits of our link audit.
Perhaps simply getting rid of many of the 'spammy' links has done us no favours - I should point out we've never been issued with a manual penalty. Was I perhaps too hasty in following the rules? I would appreciate a professional opinion, or input from anyone who has experience with a similar process. It does seem fairly odd that following guidelines and general white-hat SEO advice could cripple a domain, especially one with age (the domain has been established for 10+ years) and relatively good domain authority within the industry. Many, many thanks in advance. Ryan.
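As a side note on the .htaccess and 404/410 steps described above: the 301 mapping and possible 410s can be expressed with Apache's mod_alias. This is only an illustrative sketch - the redirect uses the example URLs from the question, while the 410 paths are invented stand-ins for the removed link-wheel pages:

```apache
# Permanent (301) redirect from the old nested URL to the new flattened one
Redirect permanent /profiles.php/bands-musicians/wedding-bands /profiles.php/wedding-bands

# Serve 410 Gone for removed link-wheel pages (paths are hypothetical here)
Redirect gone /entertainers-in-london
Redirect gone /entertainers-in-scotland
```

A 410 signals that the page was removed deliberately, which tends to get URLs dropped from the index faster than leaving them as plain 404s.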
Whether to use a new domain or an old ecommerce site domain that has been incomplete for a long time.
Hello, we are starting a second store in our niche. Which of the following should I choose?
A. We have a site from a year and a half ago that we put content on but never actually added products to. The category and article content needs to be completely rewritten; we will rewrite it to be much better and up to date. We're planning on adding products and rewriting the manufacturer descriptions.
B. We could use a new domain that is closer to an exact match for our main keyword. We'd just buy one for $15.
I don't know whether A or B would be the fastest way to get the site going. I'm concerned that leaving a site half done for a year could cause an issue, but I really don't know. If you've got experience with this, please advise. Thank you.
White Hat / Black Hat SEO | BobGW
How would you optimize a new site?
Hi guys, I'm here to ask for your personal opinions. We know that in order to rank, a site needs authoritative content that interests people. But what would you do to increase the ranking of your site, or maybe a blog post? Leaving your link in blog comments seems dangerous nowadays. Is social media the only way to go? Trying to get people to write about you? What else can be done?
White Hat / Black Hat SEO | andzon
20-30% of our ecommerce categories contain no extra content, could this be a problem
Hello, about 20-30% of our ecommerce categories have no content beyond the products that are in them. Could this be a problem with Panda? Thanks!
White Hat / Black Hat SEO | BobGW
Content optimized for old keywords and G Updates
Hi, we've got some old content, about 50 pages' worth in an ecommerce site, that is optimized for keywords that aren't the subject of the page - these keywords occur about 8 times (2 keywords per page) in the old content. We are going through these 50 pages and changing the title, H1, and meta description tags to match the exact subject of the page, so that we will increase in rankings again - the updates have been lowering our rankings. Do we need to completely rewrite the content for these 50 pages, or can we just sprinkle in any needed additions of the one keyword that is the subject of the page? The reason I'm asking is that our rankings keep dropping, and these 50 pages seem to be part of the problem. We're in the process of updating these 50 pages. Thanks.
White Hat / Black Hat SEO | BobGW
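To illustrate the kind of change described above (title, H1, and meta description re-aligned with the page's actual subject), here is a hedged sketch; "blue widgets" and the store name are placeholders, not the poster's real keywords:

```html
<!-- Sketch only: every head/heading signal describes the page's real subject -->
<head>
  <title>Blue Widgets - Sizes, Prices &amp; Reviews | Example Store</title>
  <meta name="description"
        content="Compare blue widgets by size and price, with customer reviews.">
</head>
<body>
  <h1>Blue Widgets</h1>
  <!-- body copy can then mention the keyword naturally a few times -->
</body>
```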
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi all, I'll premise this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, H1, H2, etc., but with some varying components. This is from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts was standing here, would he approve?" For this I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
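The engineer's "several standardized paragraphs, then randomize" idea can be sketched in a few lines of Python. This is an assumption-laden illustration, not the poster's actual system: the function name, the templates, and the venue-code seeding are all invented. Seeding the randomizer with the page's own key keeps the chosen variant stable, so each page always renders the same paragraph instead of changing on every load:

```python
import random

# Hypothetical standardized paragraph templates with varying components.
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
    "Looking for {topic} training in {city}, {state}? Browse upcoming dates below.",
]

def intro_paragraph(topic: str, city: str, state: str, venue_code: str) -> str:
    # Seed with the page's identity so the same page always gets the
    # same template (deterministic "randomization" across page loads).
    rng = random.Random(f"{venue_code}:{topic}")
    return rng.choice(TEMPLATES).format(topic=topic, city=city, state=state)

print(intro_paragraph("SharePoint", "New York", "NY", "NYC01"))
```

Pairing each standardized sentence with genuinely local content (directions, course dates) keeps the templated text a small fraction of each page.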
Untrusted site - malware!
I recently had my link profile audited, as I was badly affected by something in 2012 (Penguin, Panda... who knows? I never got a message from Google in Webmaster Tools about anything). Loads of INBOUND links were identified as 'dodgy', and the person highlighted them in different colours. However, another SEO expert told me to leave them (perhaps remove just 3 of them) and not bother with the rest. Now I am not sure what to do. Any opinions?
RED: 3 were highlighted as being from untrusted malware sites. I think I should disavow them, but really, would 3 make that much difference to a fall in my site?
ORANGE: 240 were said to be spam articles, and I was advised: "The following pages highlighted in orange are on sites created for the purpose of publishing articles for link building. Since the same articles appear on multiple sites, Google views this as duplicate content. Links to Monteverde Tours in these articles should be removed or tagged 'nofollow'. Where this is not possible, the domains should be disavowed."
YELLOW: 85 were said to be from low-quality directories: "The following pages highlighted in yellow are on low-quality directories and link farms. Links to Monteverde Tours on these pages should be removed or the domains disavowed."
GREEN: 340 were from pages that were not found, suspended accounts, pages with loading problems, removed links, or expired domains: "The following pages highlighted in green include pages whose links to Monteverde Tours have been removed and pages that were inaccessible for various reasons, as shown in the Comments column. These pages or their domains should be disavowed to remove them from the Google index."
I have read (and asked on this forum) about disavowing, but the more I read, the more confused I get about the next action. I tried for one year to get rid of any bad links, did blogging, social media, improved content, landing pages, etc., but all to no avail. Any opinions appreciated. I am not looking for a magic bullet; I know there isn't one. I know I need to keep improving content, but after a year of NO improvements, should I consider the link removal route?
White Hat / Black Hat SEO | Llanero
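For anyone weighing the disavow route discussed above, Google's disavow file is a plain-text upload: one `domain:` or URL entry per line, with `#` comments. The entries below are placeholders for illustration, not the actual sites from this audit:

```text
# Red - untrusted/malware domains
domain:malware-example-1.com
domain:malware-example-2.net

# Yellow - low-quality directories and link farms
domain:cheap-directory-example.com

# Orange - an individual spam-article URL where a domain-wide disavow is too broad
http://article-network-example.com/monteverde-tours-article.html
```

Removal requests and nofollow tagging come first; the disavow file is the fallback for webmasters who never respond.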
Same content, different target area SEO
So OK, I have a gambling site that I want to target at Australia, Canada, the USA and England separately, and still have the .com for worldwide (or not, read further). The website's content basically stays the same for all of them, perhaps with just small changes to layout and information order (a different order for the top 10 gambling rooms).
Question 1: How should I mark the content for Google and other search engines so that it is not considered "duplicate content"? As mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I've thought of so far:
1. A separate Webmaster Tools account for every domain - we will need to set up user targeting to the specific country in it.
2. Use hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains; more info at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk were from a different C-class than the one for the .com.
Is there anything I am missing here?
Question 2: Should I target the .com at the US market, or are there other options? (We're not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
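Point 2 above (hreflang) would look roughly like this in the head of each page. The domains are assumptions for illustration; every country version must carry the full set of annotations, including a reference to itself:

```html
<!-- Illustrative only: list every alternate on every version of the page -->
<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/top-rooms/" />
<link rel="alternate" hreflang="en-US" href="http://www.example.com/top-rooms/" />
<link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/top-rooms/" />
<link rel="alternate" hreflang="en-CA" href="http://www.example.ca/top-rooms/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/top-rooms/" />
```

hreflang plus per-domain geotargeting in Webmaster Tools addresses the duplicate-content concern directly; server IP location is generally considered a much weaker signal.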