Is it bad to have the same template for all of my EMDs?
-
I have been working on EMDs and have more than 40 of them, all two-keyword EMDs with DA around 35.
They all use the same template. Could that be a problem in the future? (They don't have similar content.)
-
Yeah, fair enough. Thanks, Andy.
-
Consider this: a huge chunk of the websites out there use off-the-shelf templates (think WordPress). Even Matt Cutts' blog is an off-the-shelf (if edited) template.
-
Hey Gerd, thanks, that's helpful. I searched for it and found some nice reads! I didn't know about manual reviews. But I guess a manual review is sometimes a good thing: if you have good-quality brand sites (EMDs), you don't have to worry about an algorithm flagging your site by mistake.
-
That's a good example. Thanks for sharing.
Though I guess if something like this happens, it won't be down to the template alone; it would likely be a combination of factors: low quality, similar patterns in the content, a network of similar EMDs with the same template all interlinked, and so on.
-
Thanks for sharing your opinion.
-
This is already happening: do a search for "google manual review" and have a look at the manual review process. Google currently employs companies to perform manual website reviews, based on search terms, to classify websites.
So although you have the same template with different content on different websites, the danger is that your sites interlink from a link-building perspective, and a manual review might demote all of them.
As others said, the chances are slim, but it's certainly possible.
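To make the distinction concrete, here is a minimal sketch of how a crawler could fingerprint a page's template separately from its content, by hashing only the tag structure. This is purely my illustration of the idea, not any documented Google process; the sites and function names are made up.

```python
# Hypothetical sketch: fingerprint a page's template by hashing its tag
# skeleton while ignoring the (unique) text content. Illustration only.
import hashlib
from html.parser import HTMLParser

class TagSkeleton(HTMLParser):
    """Collects only the sequence of opening tags, discarding all text."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def template_fingerprint(html: str) -> str:
    parser = TagSkeleton()
    parser.feed(html)
    return hashlib.sha1("/".join(parser.tags).encode()).hexdigest()

site_a = "<html><body><div class='nav'>Home</div><p>Red bicycles</p></body></html>"
site_b = "<html><body><div class='nav'>Home</div><p>Blue kayaks</p></body></html>"

# Different content, identical tag skeleton -> identical template fingerprint.
print(template_fingerprint(site_a) == template_fingerprint(site_b))  # True
```

The point of the sketch is that template similarity is trivially detectable, which is exactly why, on its own, it is too common a property to be a useful spam signal; the interlinking between the sites is the riskier part.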
-
Honestly, no clue! Google is in its teenage years, doing all the smart things an ambitious kid does to prove itself the best among others (and sometimes that doesn't go well...).
I won't say it's impossible, but at the moment there is no such factor!
-
IMHO, no, I don't believe so.
Consider an ecommerce platform like OpenCart. The default template is probably being used tens of thousands of times with unique content, and penalizing all those sites because of it probably would not improve the end user's experience.
The purpose of algorithm updates is to improve the user's experience, so the content is the key, not necessarily the template (unless it's extremely poor and affects usability).
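As a rough sketch of the "content is the key" point: a simple text-similarity check (here with Python's standard-library difflib, purely as an illustration, with made-up page copy) scores genuinely different copy as dissimilar even when the surrounding template is identical.

```python
# Minimal sketch (illustration only, not any real ranking check): compare the
# visible text of two pages; identical templates with genuinely different
# copy still yield a low content-similarity ratio.
from difflib import SequenceMatcher

def content_similarity(text_a: str, text_b: str) -> float:
    """Ratio in [0, 1]: 1.0 means identical text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page_a = "Hand-built red bicycles, shipped anywhere in the UK."
page_b = "Inflatable blue kayaks for calm-water touring trips."

print(content_similarity(page_a, page_a))  # 1.0
print(content_similarity(page_a, page_b))  # well below 1.0
```

So as long as each site's copy is unique, a shared template leaves little for a duplicate-content check to object to.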
-
Yeah, I thought so, but do you think a future update might include anything like that?
-
I am assuming you are talking about the design template of the EMDs (websites). In that case, from an SEO point of view there won't be any problem, but from a user's point of view, people might get frustrated by going to different domains and finding the same kind of website... then again, every niche behaves a little differently toward websites.
From a technical standpoint, there is no problem with having the same template!
-
Hi,
Don't worry about the same template. If your sites have unique content, the same design won't harm your rankings.
-
Yes, my sites are even ranking in the top 5 for most of their keywords, but I'm afraid some future update might change things!
-
Yeah, I believe there's no problem at the moment, but I imagine a future Google algorithm update might introduce something like that, where the same template is used as one factor, along with the sites being hosted on the same servers and linked to each other, to conclude that the sites are all the same and should be treated as spam.
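The "combination of factors" worry can be sketched as a toy scoring function. To be clear, this is entirely my assumption of how weak signals might be stacked, with made-up weights; it does not describe any documented Google algorithm.

```python
# Toy illustration only (assumed weights, not a documented algorithm):
# how several weak "same network" signals could stack into one score.
def network_spam_score(same_template: bool, same_server: bool,
                       interlinked: bool, thin_content: bool) -> int:
    """Returns a 0-100 score; no single signal is decisive on its own."""
    weights = [
        (same_template, 10),   # weakest signal: templates are widely reused
        (same_server, 15),
        (interlinked, 30),
        (thin_content, 45),    # content quality carries the most weight
    ]
    return sum(w for flag, w in weights if flag)

# A shared template alone stays far below any plausible spam threshold...
print(network_spam_score(True, False, False, False))  # 10
# ...but stacked with shared hosting, interlinking, and thin content it maxes out.
print(network_spam_score(True, True, True, True))     # 100
```

Under this framing, the template itself contributes the least; the interlinking and content quality discussed above would do most of the damage.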
-
You could face a risk if your sites are interlinked and a manual review flags them as similar and demotes some. I think this is a very rare case and unlikely to happen. Just remember, Google has a better understanding of link graphs than any tool available. I have seen sites drop due to a manual review (though the demotion was not because of a shared UI).
I honestly would not worry too much about it as long as your copy, brand, keywords, and on-page SEO differ.
-
I'm not sure, but in my experience the same template with different content does not affect your SEO. If all your sites have different content, Google does not consider them duplicate sites. One of my clients has a multilingual site using the same template, and he still ranks in the top 10.