Template Files .tpl versus .html files
-
We sell a large selection of insulation products and use template files (.tpl) to collect up-to-date information from a server-side database file that contains some 2,500 line items.
When an HTML (.html) file is requested on the Internet, the 'example.tpl' file is accessed, the latest product and pricing information is retrieved from the database, and the result is presented to the viewer as 'example.html'.
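For illustration, a minimal Python sketch of that flow (the file name, the $product_rows placeholder, and the sample data are all hypothetical; production setups typically use a PHP/Smarty-style engine, but the mechanics are the same):

```python
from string import Template

def render_page(tpl_path, rows):
    """Fill a .tpl file with live product/pricing rows and return HTML.

    The visitor (or crawler) only ever receives the rendered HTML;
    the .tpl file itself is never served.
    """
    with open(tpl_path, encoding="utf-8") as f:
        tpl = Template(f.read())  # assumes a $product_rows placeholder
    table_rows = "\n".join(
        "<tr><td>{sku}</td><td>{name}</td><td>${price:.2f}</td></tr>".format(**r)
        for r in rows
    )
    return tpl.substitute(product_rows=table_rows)

# Hypothetical usage: one of the ~2,500 line items from the database.
html = render_page("example.tpl", [
    {"sku": "R-19", "name": "Fiberglass batt insulation", "price": 42.50},
])
```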
My question: Can the use of .tpl files negatively impact Search Engine acceptance?
Related Questions
-
Large robots.txt file
We're looking at potentially creating a robots.txt with 1,450 lines in it. This will remove 100k+ pages from the crawl, all of them old pages (I know the ideal would be to delete/noindex them, but that's unfortunately not viable). The issue I'm anticipating is that a large robots.txt will either stop the file from being honoured or will slow our crawl rate down. Does anybody have any experience with a robots.txt of that size?
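If the old pages share URL patterns, wildcard rules (which Google and Bing support in robots.txt) can often collapse hundreds of literal lines into a handful. A hypothetical sketch, assuming the retired pages sit under predictable paths:

```
User-agent: *
# Each wildcard rule below stands in for many literal Disallow lines,
# assuming the old pages share a common directory or URL pattern.
Disallow: /old-catalog/
Disallow: /*?sessionid=
Disallow: /*-discontinued.html$
```

For what it's worth, Google documents a 500 KiB processing limit for robots.txt, which 1,450 plain Disallow lines would not come close to.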
Intermediate & Advanced SEO | ThomasHarvey
Handling Multiple Domain 301 Redirects on Single htaccess file
Hello, I have a client that currently has 9 different markets with different sub-domains on one server (aka one htaccess file). All the sites have very similar navigation and some of them contain the same products, aka the same URLs. The site uses the Magento CMS, and I'm trying to figure out how to redirect some products that have been removed from one of the stores. The problem I'm running into is that when I try to redirect one store's URL, it redirects all the sites' URLs. Example: http://store.domain1.com/ and http://store.domain2.com/ — I'd like to redirect http://store.domain1.com/old-url.html to http://store.domain1.com/new-url.html without making http://store.domain2.com/old-url.html redirect. I've literally been pulling my hair out trying to figure this one out but have had no luck. Does anybody have any ideas on how I could do this without having the sites redirect or creating any loops? Any wisdom from you Apache experts would be greatly appreciated. Thanks, Erik
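A hedged sketch of the standard mod_rewrite approach: condition each redirect on the requested host so only the intended subdomain matches (the URLs mirror the example above; the rule syntax assumes the .htaccess sits in the document root):

```apache
RewriteEngine On

# Redirect old-url.html to new-url.html only when the request arrives
# on store.domain1.com; the same path on store.domain2.com is untouched.
RewriteCond %{HTTP_HOST} ^store\.domain1\.com$ [NC]
RewriteRule ^old-url\.html$ http://store.domain1.com/new-url.html [R=301,L]
```

Each removed product would get its own RewriteCond/RewriteRule pair scoped to the store it belongs to.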
Intermediate & Advanced SEO | Erik-M
Best Way To Go About Fixing "HTML Improvements"
So I have a site where I was creating dynamic pages for a while; some of them accidentally ended up with lots of similar meta tags and titles. I then changed up my site but left those duplicate tags in place for a while, not knowing what had happened. Recently I began my SEO campaign once again and noticed these errors were there, so I did the following: removed the pages; removed the directories that held these dynamic pages with the removal tool in Google Webmaster Tools; and blocked Google from scanning those pages with robots.txt. I have verified that the robots.txt works and the pages are no longer in Google search... however, the duplicates still show up in the HTML Improvements section after a week (it has updated a few times). So I decided to remove the robots.txt block and instead add 301 redirects. Does anyone have any experience with this, and am I going about it the right way? Any additional info is greatly appreciated. Thanks.
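For the 301s, one way to express them compactly, assuming the retired dynamic pages share a directory (the path and target below are hypothetical):

```apache
# Pattern-based 301 via mod_alias: send every retired dynamic URL
# under /old-pages/ to one replacement page. Note this only works
# while robots.txt is NOT blocking those paths — a blocked URL is
# never crawled, so its redirect is never seen.
RedirectMatch 301 ^/old-pages/ /new-page.html
```

That interplay may also be why the HTML Improvements report lagged: pages blocked by robots.txt can linger in the report because Google can no longer recrawl them to see the change.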
Intermediate & Advanced SEO | tarafaraz
Pros or Cons of adding Schema Markup via HTML or through Webmaster Data Highlighter
Hello, I am in the process of adding schema to a site that I am working on. Are there advantages or disadvantages to adding it via HTML on the site versus through Webmaster Tools' Data Highlighter? Thank you.
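For the HTML route, the usual pattern is a JSON-LD block in the page source; a sketch with placeholder values (pick the schema.org type that actually fits the site):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```

One practical difference: markup in the source is visible to every search engine, while the Data Highlighter only teaches Google and never touches the site itself.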
Intermediate & Advanced SEO | TP_Marketing
IP Address: Ownership Location Versus IP Resolve
We are a US-based ecommerce company that recently switched hosting to a Canadian-owned company. I was told we would have a US-based IP address, but noticed yesterday that the MozBar is listing my website, 1800doorbell.com, as a Canadian company. I've researched this online, and what's typically stated is that your IP location needs to be in the geo area you serve. When I brought this up to my host they stated:

"The location being reported by many of these tools will be the one from the WHOIS. Since our corporation is registered in Canada, it will return a matching result. You can verify the location of the address by issuing a traceroute and examining the location codes at the end of the traceroute. For example, on: 96.125.180.207"

So now I am really confused. What matters to me is how the search engines see my IP address. Will/do they see it as a US IP address? Below is the output I received back from DNSstuff, and thanks for any help:

| Field | Value |
| --- | --- |
| ASN | 12179 |
| Name | INTERNAP-2BLK |
| Description | Internap Network Services Corporation |
| # Peers | 11 |
| # IPv4 Origin Ranges | 32 |
| # IPv6 Origin Ranges | 2 |
| Registrar | ARIN |
| Allocation date | Apr 13, 1999 |
| Country Code | US |
| Reverse | unknown.static.dal01.cologlobal.com. |
| Reverse-verified | No |
| Origin AS | Internap Network S... |
| Country Code | CA |
| Country | Canada |
| Region | North America |
| Population | 31592805 |
| Top-level Domain | CA |
| IPv4 Ranges | 5944 |
| IPv6 Ranges | 336 |
| Currency | Canadian Dollar |
| Currency Code | CAD |
| IP Range - Start | 96.125.176.0 |
| IP Range - End | 96.125.191.255 |
| Registrar | ARIN |
| Allocation date | May 10, 2011 |

Intermediate & Advanced SEO | jake372
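To see where the IP itself resolves, the checks the host describes can be run directly from a terminal; a sketch (output formats vary by platform, so read the country/location fields rather than expecting exact strings):

```sh
# Trace the route to the IP; hop hostnames often embed location codes
# near the end of the trace — the reverse DNS above already contains
# "dal01", which commonly denotes a Dallas datacenter.
traceroute 96.125.180.207

# Query the regional registry record for the IP; ARIN output includes
# the network's registered country, which geo-IP tools tend to use.
whois 96.125.180.207
```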
How Do I Create Multiple Pages In HTML Sitemap?
I'm working on an html sitemap for our ecommerce site and want to limit the links on each page to less than 100. I've created an article for the initial page, but what is the proper way to go to the next page? Do I create another article page (and so on and so on) until I have the sitemap completed? If so, how do I link from one page to the next? Would my on page text read: "sitemap continued" with anchor text on the link "sitemap page 2.."? It seems like all the sitemaps I've seen just fill one page with links and very little regard for "link saturation" and continuous pages. Thanks!
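One common pattern is a chain of plain paginated pages; a hypothetical sketch (file names and anchor text are placeholders):

```html
<!-- sitemap.html: the first batch of links, capped under 100 -->
<ul>
  <li><a href="/category/widgets.html">Widgets</a></li>
  <li><a href="/category/gadgets.html">Gadgets</a></li>
  <!-- ...more category/product links... -->
</ul>
<p><a href="/sitemap-2.html">Site map, page 2</a></p>
```

Each page then links to the next (and ideally back to the previous), so crawlers can walk the whole chain from the first page.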
Intermediate & Advanced SEO | AWCthreads
"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
The Problem

I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee; however, their business is truly 'local', with a local service area, local phone/address, unique business name, and virtually complete control over their web presence (URL, site design, content; apart from a few branding guidelines). Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area.

Lately my white-hat backlinking strategies have not been yielding the results they did one year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies. I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using the SEOmoz toolbar and Site Explorer stats, and factoring in general quality vs. quantity dynamics).

Questions

Assuming general on-page optimization and linking factors are equal:

1. Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)?
2. If I choose to differentiate each client's website, how much differentiation makes sense? Specifically: even if primary content (copy, essentially) is differentiated, will Google still interpret the matching code structure as 'the same website'? Are images as important as copy in differentiating content? From a 'machine' or algorithm perspective evaluating unique content, I wonder whether strategies would be effective such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code).

Considerations

My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level and choosing which result to show from a pool of duplicate results. My clients' search terms most often contain client-specific city and state names. Despite the "original content" mantra, I believe my clients, being local businesses who have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target users' searches; it is in this spirit that I ask these questions, not to 'game' Google with malicious intent. In an ideal world my clients would all have their own unique website developed, but these are Main St business owners balancing solutions with economics, and I'm trying to provide them with scalable solutions.

Thank You!

I am new to this community. Thank you for any thoughts, discussion, and comments!
Intermediate & Advanced SEO | localizedseo
Should we block urls like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors — roughly 9,000 — on our eCommerce website. This site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content/titles due to URLs like: domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut off bots after the ".html?"? Any help would be much appreciated 🙂
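To cut crawlers off at the query string, wildcard rules are the usual tool; a sketch (Google and Bing honour * and $ in robots.txt, though rogerbot and other crawlers each have their own level of support):

```
User-agent: *
# Block any /shop/ URL that carries a query string, so the faceted
# sort/filter variants (brand, cat, dir, order, price) are not
# crawled, while the clean .html pages stay reachable.
Disallow: /shop/*.html?
```

A rel="canonical" from each filtered variant to the clean URL is a common companion fix in Magento, since parameter URLs that are merely blocked can still be indexed if they are linked to.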
Intermediate & Advanced SEO | MonsterWeb28