How do I get this program to see URLs with www. and without www. as the same?
-
The program is treating a page with www. as a different page from the same page without the www., so they show up as duplicate pages even though they are the same page. How do I filter this?
-
Thank you for such a quick response. You answered more than you know with this answer; I also had problems with URLs that I capitalized in the past and did not know how to fix them. Now I know.
Thank you
-
You don't want to filter it! Instead you want to work out why it's happening, and then fix it. For starters, add a rel="canonical" link, and then also 301 redirect the non-www version to the www version, or vice versa.
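As a minimal sketch of the rel="canonical" part (assuming www.example.com as a placeholder for your preferred hostname), each page would carry a tag like this in its head:

```html
<head>
  <!-- Tells search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="http://www.example.com/some-page/">
</head>
```

Whichever hostname you pick as canonical here should match the one your 301 redirect points to, so the two signals agree.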
You do NOT want Google to see duplicate content; you want all roads to lead to the one URL.
Therefore look into .htaccess 301 redirection (for Apache) or the equivalent on your server.
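A sketch of a non-www to www 301 in .htaccess (assuming Apache with mod_rewrite enabled, and example.com as a placeholder for your own domain) might look like:

```apache
RewriteEngine On
# If the request host is anything other than www.example.com (e.g. bare example.com) ...
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
# ... permanently (301) redirect to the same path on the www host
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Note that as written this also catches any other hostname pointed at the site; flip the condition and target if you prefer the non-www version as canonical.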
SEOMoz has a bunch of info about duplication and redirection.
The latter is extremely important for SEO. Duplicate content is bad, and at the moment all of your content has the potential for duplication, so you want to ensure that there's only one version of everything.
Related Questions
-
Proper URL Structure. Feedback on Vendors Recommendation
Urgent! We're doing a site redesign and our vendor recommended a new URL structure as follows: website.com/folder/word1word2word3. Our current structure is website.com/word1-word2. They said that from an SEO perspective it doesn't make a difference whether there are dashes between the words or not, and that Google can read either URL. Is that true? I need experts to weigh in on the above, as well as on the SEO implications if we were to implement their suggestion.
On-Page Optimization | bluejay78780
Content with changing URL and duplicate content
Hi everyone, I have a question regarding content (user reviews) that changes URL all the time. We get a lot of reviews from users who have been dining at our partner restaurants, which get posted on our site under (new) “reviews”. My worry, however, is that the URL for these reviews changes all the time. The reason for this is that they start on page 1 and then get pushed down to page 2, and so on, as new reviews come in. http://www.r2n.dk/restaurant-anmeldelser I’m guessing that this could cause serious indexing problems? I can see in Google that some reviews are indexed multiple times with different URLs, and some are not indexed at all. We furthermore have the specific reviews under each restaurant profile. I’m not sure if this could be considered duplicate content? Maybe we should tell Google not to index the “new reviews” section by using robots.txt. We don’t get much traffic on these URLs anyway, and all reviews are still under each restaurant profile. Or maybe the canonical tag can be used? I look forward to your input. Cheers, Christian
On-Page Optimization | Christian_T2
www vs no-www duplicate which should I use
Site is no-www. I caught this in the archives. Will this be my fix?
Mike Davis, Online Marketing Manager at McKesson, May 22, 2013:
Easy fix: in your .htaccess file, use this:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^domain\.com
RewriteRule (.*) http://domain.com/$1 [R=301,L]
Remember to replace domain.com with your domain name. Enjoy!
On-Page Optimization | touristips
-
How Much Does a Missing www. 301 redirect hurt a business?
We're preparing a report for a potential client and are trying to figure out a way to estimate ranking gains. One of the major issues is the lack of a 301 redirect from the non-www domain to the www domain. We checked and there's no canonicalization, so it's a clear issue. According to Google, the non-www version has links from 8 different domains; the www version of the website has links from 248 different domains. Nearly all anchor text is branded, as they've never had any SEO work done before. Does anyone have a suggestion for approximating the benefits of setting up their .htaccess file correctly? Would the benefits even be that great? We're of course advising additional things, but we just want to be more certain about this step's SEO boost.
On-Page Optimization | FlynnZaiger0
How do I get my on-page grade reports?
I've been signed up for a week now and there's still nothing in there. Do I need to set something up? Thanks in advance! Fraser
On-Page Optimization | vipgambler0
Search engine friendly URLs
I'm going to create some new content for my site, and I'm trying to decide on the best search-engine-friendly format. Namely, is it OK to use a subdirectory, or should I keep all content at the root level? Is the SEO effect of either of these URLs superior to the other? domain.com/cooking/lasagna.php vs domain.com/lasagna.php
On-Page Optimization | limens0
301 redirect and then keywords in URL
Hi, Matt Cutts says that 301 redirects, including ones on internal pages, cause the loss of a little bit of link juice. But I also know that keywords in the URL are very important. On our site, we've got unoptimized URLs (few keywords) on the internal pages. Is it worth doing a 301 redirect in order to optimize the URLs for each main page? 301 redirects are the only way we can do it on our premade cart. For example (just an example), say one of the 4 main keywords for the page is "brown shoes". I'm wondering if I should redirect something like shoes.com/shoecolors.html to shoes.com/brown-shoes.html. In other words, with the loss of juice, would we come out ahead? In what instances would we come out ahead?
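If the rename does prove worthwhile, the per-page 301 itself is a one-liner in .htaccess; a sketch using the hypothetical paths from the question above:

```apache
# Permanently redirect the old, unoptimized URL to the keyword-rich one
Redirect 301 /shoecolors.html /brown-shoes.html
```

This uses Apache's mod_alias Redirect directive, which matches on a literal path prefix; for pattern-based rewrites across many URLs you'd reach for mod_rewrite instead.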
On-Page Optimization | BobGW0
Getting pages indexed by Google
Hi SEOMoz, I relaunched a site back in February of this year (www.uniquip.com) with about 1 million URLs. Right now I'm seeing that Google is not going past 110k indexed URLs (based on sitemaps). Do you have any tips on what I can do to make the site more likeable to Google and get more URLs indexed? All the part pages can be browsed to by going to: http://www.uniquip.com/product-line-card/suppliers/sw-a/p-1 I've tried to make the content as unique as possible by adding random testimonials and random "related part numbers"; see here: http://www.uniquip.com/id/246172/electronic-components/infineon/microcontrollers-mcu/sabc161pilfca Do I need to wait more and be more patient with Google? It just seems like I'm only getting a few thousand URLs indexed per day at the most. Would it help if I implemented a breadcrumb on all part pages? Thanks, -Carlos
On-Page Optimization | caneja0