Having a portal page that takes you to a website with a different URL
-
We are in the planning stages for this.
Our client wants his domain name to serve as a portal page for this new campaign.
His domain name is a non-keyword company name (e.g., widgetsgalore.com).
We already have a website with content tailored to his business ready to go. In fact, we ran a campaign from '06 to '09 that was highly successful. At that time it was just the website with a keyword-rich URL.
Now, for some reason, the client wants his company-name URL (widgetsgalore.com) to be the portal (landing) page, so that potential clients click on it and are then taken to the website with the content.
What are the pros and cons of making widgetsgalore.com a portal page, as the client asks, versus sending visitors directly to the URL with all the content, forms, etc.?
This is a local site, with an audience limited to Southern California.
-
Appreciate your feedback.
This clarifies the issues for us.
The timing worked out perfectly ... so yes, your response worked for us.
No doorway page. Thanks for adding the link to the Matt Cutts article.
-
Short version: Do not do this!
Long version: What the client is asking for is a "doorway page." This was a very popular black-hat SEO tactic years ago. Here's some background and additional links from Wikipedia.
Such a tactic is very bad for many reasons. Here are just a few:
1. It is bad for the user experience. If I click on a search-results page to go to a website and am interrupted by a doorway page, that makes me angry. It's an unnecessary step that makes it take longer to get to where I want to go.
2. It's bad for SEO. Doorway pages are typically associated with keyword spam, and as such Google really, really hates them. Here's an old but still relevant post by Matt Cutts (Google's head of web spam). I can all but guarantee that your client's site will be whacked by Google if he uses doorway pages.
Your client is likely relying on very old (and very bad!) SEO advice. He should be reminded that SEO is not a "bag of tricks" to get a site to rank first in Google very quickly. It is a collection of technical and other best practices that help to improve the experiences of both humans and search engines on a website.
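If the client's underlying goal is just for widgetsgalore.com to send visitors to the content site, a server-level 301 redirect accomplishes that with no doorway page: users land on the content immediately, and most link equity passes through. A minimal sketch, assuming an Apache host with mod_rewrite; the target domain here is a placeholder, not the client's real content site:

```apache
# .htaccess on widgetsgalore.com (hypothetical domains)
RewriteEngine On
# Send every request, path included, to the content site.
# 301 = permanent, so search engines consolidate ranking
# signals on the destination URLs.
RewriteCond %{HTTP_HOST} ^(www\.)?widgetsgalore\.com$ [NC]
RewriteRule ^(.*)$ https://www.widgets-content-example.com/$1 [R=301,L]
```

Depending on the client's branding goals, hosting the content directly on widgetsgalore.com may be simpler still; either way, no intermediate page is ever shown to the user.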
Sorry I didn't see this question sooner -- I hope it worked out!
Related Questions
-
Using the Onpage Grader for Local Business websites
Hey guys, curious how people use the on-page grader for optimizing pages for local businesses specifically. I'm interested in whether people use keywords with or without a geo modifier, since adding a geo modifier will prevent the more natural writing needed to increase the score. If you don't use a geo modifier, do you have some general rules for where the city needs to appear, e.g., in the H1 and first paragraph? Any tips for using the page grader for local businesses would be great. Thanks!
Local Website Optimization | solidlocal0
-
Do old backlinks still help a new URL with a 301 redirect? Also, I added the www; how does this affect it all?
I changed my URL from exampledetailing.com to exampleautodetailing.com. It is redirected with a 301. Also, it is on Squarespace, AND I opted to add the www. So will the old backlinks to exampledetailing.com still help the new URL exampleautodetailing.com, or do I need to try to update all the links? Also, for future links, do I need to include the www, just the root domain exampleautodetailing.com, or even the whole https://www.exampleautodetailing.com? I believe the www is considered a subdomain and a new entity on Google, so I am not sure how that works. Thank you!
Local Website Optimization | Rmarkjr810
-
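A way to see why the www question matters: until redirects say otherwise, search engines treat each scheme/host variant as a separate URL. The sketch below, in Python, illustrates the mapping a correct 301 setup should produce; the domains are the ones from the question, and this is an illustration of the concept, not something Squarespace requires you to run:

```python
from urllib.parse import urlparse, urlunparse

CANONICAL_HOST = "www.exampleautodetailing.com"
# Hosts that should all 301 to the canonical host above.
NON_CANONICAL_HOSTS = {
    "exampledetailing.com",
    "www.exampledetailing.com",
    "exampleautodetailing.com",  # bare new domain, no www
}

def canonicalize(url: str) -> str:
    """Map any old-domain or non-www variant to the canonical URL,
    mirroring what the 301 redirect chain should do."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    if host in NON_CANONICAL_HOSTS or host == CANONICAL_HOST:
        # Force https and the canonical www host; keep the path.
        return urlunparse(("https", CANONICAL_HOST, parts.path,
                           parts.params, parts.query, parts.fragment))
    return url

print(canonicalize("http://exampledetailing.com/services"))
# -> https://www.exampleautodetailing.com/services
```

Old backlinks keep passing most of their value through the 301, and future links can use any variant, though linking straight to the canonical www URL avoids an extra redirect hop.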
What's the current best practice for URL structure?
We're really confused about the current best practice for URL structure. For example, what would anyone advise to rank for "luxury hotel rooms"?
name.com/luxury-hotel-rooms/
name.com/hotel/luxury-hotel-rooms/
name.com/hotel/luxury-rooms/
name.com/hotel/luxury/
name.com/luxury-rooms/
Or do we add location?
name.com/luxury-hotel-rooms-location/
name.com/hotel/luxury-hotel-rooms-location/
name.com/hotel/luxury-rooms-location/
They also do cottages: name.com/cottages/sea-view-holiday-cottages/
Local Website Optimization | SolveWebMedia0
-
SERP: From page 4 to page 1 to page 4 again -_- ...
Hi there, Moz amigos! So I have this website: campmusicaladagio.com. Right now, our main target keyword is "camp de jour gatineau". The website was on Wix before, so I created the WordPress version and redirected the domain name to the new hosting server (outside of Wix). Before the changes, the website was on page 4. After the changes, it went to page 1 within a week (lol, Wix sucks so much). After 3 weeks on page 1, it went back to page 4. I am so confused... what on earth happened? Any ideas?
Local Website Optimization | Gab-SEO0
-
Remove URLs from App
Hi all, our tech team inherited a bit of an SEO pickle. I manage a freemium React JS app built for 80k unique markets worldwide (and an associated dedicated URL schema). Ex: https://www.airdna.co/vacation-rental-data/app/us/california/santa-monica/overview
Mistake: the app, in its entirety, was indexed by Google in July 2018, which basically resulted in duplicate-content penalties because the unique on-page content wasn't readable.
Partial solution: we noindexed all app pages until we were able to implement a pre-render / HTML-readable solution with associated dynamic metadata for the Overview page in each market. We are now selectively reindexing only the free Overview pages that have unique data (with a nofollow on all other page links), but we want to keep a noindex on all other pages because their data is not uniquely readable before subscribing. We have the technical server-side rules in place and working to ensure this selective indexing.
Question: how can we force Google to abandon the >300k cached URLs from the summer's failed deploy? Ex: https://screencast.com/t/xPLR78IbOEao would lead you to a live URL such as https://www.airdna.co/vacation-rental-data/app/us/arizona/phoenix/revenue, which has limited value to the user. (Note Google's cached SERPs also have an old URL structure, which we have since 301ed, because we also updated the page structure in October.) Those pages are currently noindexed and will remain so for the foreseeable future. Our sitemap and robots.txt file are up to date, but the old Search Console only allows temporary removals on a one-by-one basis. Is there a way to write a rule-based page removal? Or do we simply render these pages in HTML and remove the nofollow on the links from the Overview page so a bot can reach them, see the noindex, and drop them from the SERPs? Thanks for your help and advice!
Local Website Optimization | Airbnb_data_geek1
-
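On the poster's last point: that is indeed how removal works at scale. Google drops a URL once Googlebot can recrawl it and sees a noindex directive, so the page must be reachable (links not nofollowed) and must not be blocked by robots.txt. A minimal illustration of the two standard ways to express the directive (this is the generic pattern, not the poster's actual configuration):

```html
<!-- Option 1: robots meta tag in the served (pre-rendered) HTML head -->
<meta name="robots" content="noindex">

<!-- Option 2: the equivalent HTTP response header, handy when the
     HTML itself is hard to modify:
       X-Robots-Tag: noindex -->
```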
Dual website strategy
We have two websites (different businesses) in the technology sector that sell the same products on the same platform (OSC) but have different branding. We have tried to make the static content different, and the user-generated content is different. The SEO is largely different too. But one site has much better rankings than the other. While the underperforming site is not responsive yet, I need to decide whether to merge the two businesses into one site or continue with the two-separate-websites approach. I would only pursue the latter approach, and invest further time and effort into the underperforming website, if I knew I was "on the right track". My SEO knowledge is not extensive, so I would be interested in any views the community has. I note that kogan.com.au and dicksmith.com.au have a similar dual-website approach (same company), and they are both major brands in Australia. I thank you in advance for any thoughts you may have.
Local Website Optimization | Alpine91
-
Boost Website Traffic
Tom Beavan Websites is my business, where I create and design affordable WordPress websites for small businesses. I am looking to improve traffic to my website as dramatically as possible. At present, my website is a one-page site with limited content: https://www.tombeavan.co.uk. My website ranks #1 for local keywords like:
Web design Wiltshire
Web design Trowbridge
Wordpress developer UK
So in terms of keyword position I am doing well for local business, but I only get 200-300 visitors per month. I would like to dramatically improve this to increase the number of enquiries I get. I do tend to get a few enquiries, but I think that if I improve website traffic, the number of enquiries will increase too. I have a long list of tasks I would like to do for SEO:
Add a lot more content to the website
Add more backlinks
Guest blogging
Lots more
What would you recommend as a good starting place, or the task that will increase traffic most effectively? Thanks for your advice in advance 🙂
Local Website Optimization | tombeavan0
-
Which is better for Local & National coupons: 1000s of Indexed Pages per City, or only a Few?
Not sure where this belongs... I am developing a coupons site listing local and national coupons (think Valpak + RetailMeNot), eventually in all major cities, and am VERY concerned about how many internal pages to let Google 'follow' for indexing, as it can exceed 10,000 per city. Is there a way to determine the optimal approach for internal paging/indexing BEFORE I actually launch the site? It is about ready except for this darned URL question, which seems critical. I.e., can I put in search words for Google to determine which ones are most worthy of their own indexed page? I'm a newbie, sort of, so please put the answer in simple terms. I'm one person with limited funds and need to find the cheapest way to get the best organic results for each city I cover. Is there a generic answer? One SEO firm told me the more variety the better. Another told me that simple is better, and to use content on the simple pages to get variety. So confused, I decided to consult the experts here!
Here's the site concept. FOR EACH CITY: The user inputs a location: main city only (i.e., Houston), or 1 of 40 city regions (suburb, etc.), or zip code, or zip-street combo, or GPS lookup. A miles range is defaulted or chosen by the user. After the search area is determined, the user chooses 1 of 6 types of coupon searches:
1. Online shopping with national coupon codes, with a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts). These are national offers for chains like Kohls, which do not use the user's location at all.
2. Local in-store shopping coupons, with the same 16 categories and 100 subcategories used for online shopping in #1 (mom & pop shoe store or local chain offer). The results will be within the user's chosen location and range.
3. Local restaurant coupons, with about 60 subcategories (pizza, fast food, sandwiches). The results are again within the user's chosen location and range.
4. Local services coupons, with 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.). Results within the user's chosen location and range.
5. Local groceries. This is one page for the main city with coupons.com grocery coupons, listing the main grocery stores in the city. This page does not break down by subregions, zip, etc.
6. Local weekly ad circulars. This is one page for the main city that displays about 50 main national stores located in that main city.
So, what is the best way to handle the URLs indexed for the dynamic searches by location, type of coupon, categories/subcategories, and business pages? The combinations of potential URLs to index are nearly unlimited. Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohls)? If so, how do I know this? Should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for NATIONAL companies that ties to each main city: shopping/Kohls vs. shopping/Kohls/Houston, or even shopping/Kohls/Houston-suburb? Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcategories and not the others, or a few city regions but not all of them, etc., while actually having internal pages for all of them. Is it better to have 10,000 URLs, for say coupon-type/city-region/subcategory, or just one for the main city: main-city/all-coupons? Or something in between? You get the gist. I don't know how to begin to figure out the answers to these kinds of questions, and yet they seem critical to the design of the site.
The competition: sites like Valpak, MoneyMailer, and LocalSaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons and individual businesses bizname/city-subregion, but as far as I can see no city/category or city-subregion/category. And a very popular coupons site in my city only has maincity/coupons, maincity/a-few-categories, and maincity/bizname/coupons. Sorry this is so long, but it seems very complicated and I wanted to make the issue as clear as possible. Thanks, couponguy
Local Website Optimization | couponguy1
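One widely used pattern for the "pages for all, indexing for a few" situation described above is to let every combination page exist for users while exposing only a curated subset to the index: main-city and top category pages stay indexable, while long-tail location/subcategory combinations carry a noindex,follow directive so crawlers can still pass link signals through them. A minimal illustrative fragment; the example path is hypothetical:

```html
<!-- On long-tail combination pages, e.g. /houston/suburb/brakes/,
     keep the page available to users but out of the index;
     "follow" lets crawlers still follow the links on it. -->
<meta name="robots" content="noindex,follow">

<!-- Main-city and top-level category pages simply omit this tag,
     so they remain indexable. -->
```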