Multiple domains, different content, same keywords
-
What would you advise in my case?
Is it bad for Google if I have the four domains?
I don't link between them, as I don't want any association, or a loss of rankings on the branded page.
Is it bad if I link between them, or from the non-branded domains to the branded domain?
Is it bad if I have all of them in Webmaster Tools? I currently have just the branded one.
My Google+ page is all about the new, non-penalized domain, although Google gave the unique +propdental domain to the one it manually penalized (which doesn't make sense).
So: what are the things I should not do with my domains in order to follow and respect Google's guidelines? I want to stay white hat and not do something wrong without knowing it.
-
301 the additional domains to the one you want to focus on - except the domain that has the penalty, if you're certain it does.
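A domain-wide 301 on Apache can be done in each additional domain's `.htaccess` file; a minimal sketch, where `example-extra.com` and `example-main.com` are placeholders for your actual domains:

```apacheconf
# .htaccess on each additional (non-penalized) domain:
# permanently redirect every URL to the same path on the main domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example-extra\.com$ [NC]
RewriteRule ^(.*)$ https://example-main.com/$1 [R=301,L]
```

The path-preserving `$1` means deep links to the old domains keep resolving to equivalent pages rather than all dumping on the homepage.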
-
Four domains = splitting your efforts by four - splitting your potential links, DA, etc. by four.
If you work on just one domain, you can put all of your effort into it - this is the way to go.
Related Questions
-
Keyword in alt text or keyword in the body?
Let's say I have 2 percent keyword density across the whole page, and another 2 percent coming from the alt text of images. Does that count as 4 percent? Am I exceeding the limit, or is it fine?
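For reference, "keyword density" is just keyword occurrences divided by total words; a quick sketch of the arithmetic (note that a fixed percentage "limit" is the asker's assumption, not a documented threshold):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a single-word keyword as a fraction of total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

body = "seo tips and seo tricks for better seo"
print(round(keyword_density(body, "seo"), 3))  # → 0.375, i.e. 37.5%
```

To get a combined figure like the one in the question, you would run the same calculation over the page body and the concatenated alt texts and weight by word counts, rather than simply adding the two percentages.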
White Hat / Black Hat SEO | Sam09schulz
-
Separating the syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a news website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as requested by Google, but I am not sure which option is better:

A) Put all syndicated content under "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta tag on every page. In this case, though, I am not sure what will happen if we link to these articles from other parts of the website. We will waste our link juice, right? Also, Google will not crawl these pages, so it will never see the noindex. Is this OK for Google and Google News?

B) A NOINDEX meta tag on every page. Google will crawl these pages but will not show them in the results. We will still lose link juice from links pointing to these pages, right?

So... is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
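For what it's worth, option B's meta tag looks like the fragment below. Note that combining A and B is self-defeating: a robots.txt Disallow stops Google from crawling the page, so it can never see the noindex tag. `noindex, follow` keeps the page out of the index while still letting crawlers follow its outgoing links (whether Google News treats this as sufficient separation is a separate question):

```html
<!-- On every syndicated article: keep it out of the index,
     but let crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```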
White Hat / Black Hat SEO | Lukas_TheCurious
-
What tools do you use to find scraped content?
This hasn’t been an issue for our company so far, but I like to be proactive. What tools do you use to find sites that may have scraped your content? Looking forward to your suggestions. Vic
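Beyond dedicated tools (Copyscape is the usual suggestion), a quick manual spot-check is to search Google for a distinctive sentence from your page in quotes; a small sketch that builds such an exact-match query URL (the choice of snippet is up to you, this only handles the URL encoding):

```python
from urllib.parse import quote_plus

def exact_match_search_url(snippet: str) -> str:
    """Build a Google query URL for an exact-phrase (quoted) search."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"')

print(exact_match_search_url("a distinctive sentence from your article"))
```

Results for the quoted phrase on domains other than your own are candidates for scraped copies.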
White Hat / Black Hat SEO | VicMarcusNWI
-
Sudden Drop in Keyword Ranking - No Idea Why
Hi Mozzers, I am in charge of everything Web Optimization for the company I work for. I keep active track of our SEO/SEM practices, especially our keyword rankings.

Prior to my arrival at the company in January of this year, we had a consultant handling the SEO work, and though they did a decent job of maintaining our rankings for a hefty set of keywords, they were unable to get a particular competitive keyword ranking. This is odd because other derivations of that keyword, which are equally competitive, are all still ranking on page one. Also, full disclosure: they were not engaging in any questionable linking. In fact, they didn't do much of any link building whatsoever. I also haven't been engaging in any questionable content creation or spammy linking. We put out content regularly, as we are a publicly traded company - nothing spammy at all.

Anyway, one thing I tried starting in February was a social media sharing campaign among friends and coworkers, sharing the respective page and keyword on their Facebook and Google+ pages. To my surprise, this tactic worked just like natural search usually does - slowly. Over the months I saw the keyword rank from completely invisible, to page 6, to page 3, to page 2, and finally to position 6 on page one as of just last week. Today, unfortunately, the keyword is invisible again :(.

I am perplexed. It's tough to build links for our company, as we are in the public eye and everything we do has to be approved by someone higher up. I also checked our Webmaster Tools and haven't seen any notifications that could give me a clue as to what's going on. I am aware that there was a Penguin update recently and that there are monthly Panda updates, but I'm skeptical about whether those updates are correlated with this because, at first glance, our traffic and rankings for other keywords and pages don't seem to be affected. Suggestions? Advice? Answers? Thanks!
White Hat / Black Hat SEO | CSawatzky
-
Same content, different target area SEO
So, OK, I have a gambling site that I want to target separately at Australia, Canada, the USA and England, and still keep the .com for worldwide (or not - read further). The website's content basically stays the same for all of them, with perhaps just small changes to layout and information order (a different order for the top 10 gambling rooms).

My question 1 would be: how should I mark the content for Google and other search engines so that it is not considered "duplicate content"? As I have mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:

1. A separate Webmaster Tools account for every domain - we will need to set the geographic targeting to the specific country in each one.
2. Use hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains - more info: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk were from a different C-class than the one for the .com.

Is there anything I am missing here? Question 2: should I target the .com at the US market, or are there other options? (We are not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
-
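Point 2 of the question above (hreflang) looks like this in practice; the domains are illustrative, and note that hreflang annotations must be reciprocal - every country version lists all alternates, including itself:

```html
<!-- In the <head> of each country version of the page. -->
<link rel="alternate" hreflang="en-GB" href="https://example.co.uk/poker/" />
<link rel="alternate" hreflang="en-AU" href="https://example.com.au/poker/" />
<link rel="alternate" hreflang="en-CA" href="https://example.ca/poker/" />
<link rel="alternate" hreflang="en-US" href="https://example.com/poker/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/poker/" />
```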
Multiple links to different pages from same page
Hey, I have an opportunity to get listed on a themed directory page that has a high mozRank of 4+ and a high mozTrust of 5+. Would it be better to have just one link from that page going to one of my internal product category pages, or to take advantage of the 'sitelinks' they offer, which allow me to have an additional 5 anchor-text links to 5 other pages? I've attached an example: sitelinks.jpg
White Hat / Black Hat SEO | JerDoggMckoy
-
What does YouTube consider duplicate content, and will it affect my ranking/traffic?
What does YouTube consider duplicated content? If I have a PowerPoint-style video that is already on YouTube and I want to change the beginning and ending calls to action, would that be considered duplicate content? If yes, how would this affect my ranking and my YouTube page? Will it make a difference if I have it embedded on my blog?
White Hat / Black Hat SEO | christinarule