Dates in URLs
-
I have an issue with duplicate content errors and duplicate page titles which is penalising my site. This has arisen because a number of URLs are suffixed with dates and have been spidered. In principle I do not want any URL with a suffixed date to be spidered.
E.g.:
www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/06_07_13/13_07_13
http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/20_07_13/27_07_13
Only this URL should be spidered:
http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm
I have over 10,000 of these duplicates. Firstly, I wish to remove them from Google in bulk (not one by one), and secondly I wish to amend my robots.txt file so the URLs are not spidered. I do not know the format for either.
Can anyone help, please?
-
Thanks Kyle.
Particularly grateful for the Disallow format; they are the only URLs using an underscore, so it will work for me. Will be checking why these are being created.
Do I need to remove them using the Removal Tool in Google, and is there a format for doing this in bulk?
Thanks again,
Alan
-
Hi Alan,
I would probably start by adding a disallow rule to robots.txt.
`Disallow: /*_` may work and block all your dated URLs from being indexed, but it may also have adverse effects if you have any other URLs containing underscores. To test whether this solution would work, I would first implement a disallow rule targeting a single dated URL, `Disallow: /*20_07_13` for example (the wildcard is needed because the date appears mid-path), and then test whether Google has noindexed the page. GWT should tell you whether you have inadvertently blocked any other pages by doing so.
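Before rolling a wildcard rule out site-wide, it can help to check locally which paths it would catch. Note that Python's standard `urllib.robotparser` does not expand `*` wildcards the way Googlebot does, so the sketch below hand-rolls Google-style pattern matching; the paths are taken from the question above, and the matcher is a simplified illustration, not Googlebot itself:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Return True if a Googlebot-style robots.txt pattern matches a URL path.

    '*' matches any run of characters, and a trailing '$' anchors the pattern
    to the end of the path; otherwise matching is prefix-based, as in Google's
    documented robots.txt handling.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*'.
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# Paths taken from the question above.
dated = "/carbis-bay/houses-in-carbis-bay/seaspray.htm/06_07_13/13_07_13"
clean = "/carbis-bay/houses-in-carbis-bay/seaspray.htm"

print(robots_pattern_matches("/*_", dated))  # True: dated suffix is matched
print(robots_pattern_matches("/*_", clean))  # False: clean URL stays crawlable
```

Running this against a list of real URLs from your site would surface any non-dated underscore URLs the rule would catch by accident.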
You should also be thinking about how these URLs are being created and taking action to prevent it. Consider implementing canonical tags, if you haven't already, to clean up any potential duplication issues.
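For the dated duplicates in the question, a canonical tag in the `<head>` of each dated variant pointing at the clean URL would look something like this (using the seaspray.htm URL from the question as the canonical target):

```html
<link rel="canonical" href="http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm" />
```

Note that a page blocked by robots.txt can't be crawled, so Google won't see a canonical tag on it; canonical tags are an alternative to the Disallow approach rather than a complement.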
Cheers,
K
Related Questions
-
Does changing/shortening a URL hurt SEO?
Hi all, I am in the process of making small optimization changes to my site. I noticed Moz identified quite a few URLs that could be shortened. I intend to shorten these URLs and create 301 redirects to ensure website users land on the right page. My question is: will this change in URL damage rankings and engagement (assuming the URL remains content-relevant)? I have read in some places that when creating URL redirects for a change in domain, people saw a dip in rankings and engagement. I, however, am not intending to change the main domain of the site, but rather the URL slug. Any thoughts?
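Assuming an Apache server (the question doesn't say which), a per-URL 301 for a shortened slug can be declared in `.htaccess` with mod_alias. Both paths below are invented purely for illustration:

```apache
# Hypothetical example: permanently redirect an old, longer slug
# to its shortened form. "Redirect 301" issues an HTTP 301 response.
Redirect 301 /services/our-full-list-of-landscaping-services /services/landscaping
```

Keeping the domain the same and redirecting slug-to-slug like this preserves most link equity, which is consistent with the distinction the question draws from domain migrations.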
On-Page Optimization | annegretwidmer
-
What's the best Magento Community blog extension?
We are looking at FishPig's WordPress Integration extension. Has anybody used it? Possibly a dumb question, but is SEO adversely affected by the fact it's a WordPress extension on a Magento site?
On-Page Optimization | Anne_Marie_English
-
How to Structure URLs for Multiple Locations
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations. We currently have 60 locations nationwide, and our URL structure is as follows: www.mydomain.com/locations/{location}, where {location} is the specific street the location is on or the neighborhood the location is in (i.e. www.mydomain.com/locations/waterford-lakes). The issue is that {location} is usually too specific and is not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes". To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (i.e. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1: Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path: www.mydomain.com/{area} and www.mydomain.com/{state}.
Option 2: Build the city and state pages into the URL and breadcrumb path: www.mydomain.com/locations/{state}/{area}/{location} (i.e. www.mydomain.com/locations/fl/orlando/waterford-lakes).
Any insight is much appreciated. Thanks!
On-Page Optimization | uBreakiFix
-
URL parameters
Hello, I have currently paginated some content into 5 pages, e.g. http://abc.com/faqs.html?&page=2. Is this right? And how can I check whether it is correct or not?
On-Page Optimization | JohnHuynh
-
Moving our current homepage to a new URL
Our homepage currently speaks to a specific product, and we're redoing it to be more about the brand, with links to the product. The current home page has a PA of 62 with thousands of links to it. The question is: are there any best practices around this, or any risks? The current page is www.xyz.com, which we will be refreshing, then moving the existing content to www.xyz.com/product, so all the subdirectories get shifted over one level. Thanks in advance for the help!
On-Page Optimization | JoeLin
-
URL and SEO
How much weight do search engines give the URL? We're a medical call center provider and medicalcallcenter is part of our URL. Does that help us much? Thanks!!
On-Page Optimization | THMCC
-
Robots.txt: excluding URL
Hi, spiders crawl some dynamic URLs on my website (example: http://www.keihome.it/elettrodomestici/cappe/cappa-vision-con-tv-falmec/714/ and http://www.keihome.it/elettrodomestici/cappe/cappa-vision-con-tv-falmec/714/open=true) as different pages, resulting in duplicate content, of course. What is the syntax to disallow these kinds of URLs in robots.txt? Thanks so much
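Assuming Googlebot-style wildcard support (the `*` wildcard is not part of the original robots.txt standard, but Google honours it), one way to block every URL carrying the `open=true` suffix while leaving the base product pages crawlable might be:

```
User-agent: *
Disallow: /*open=true
```

As with the dated-URL case earlier in this thread, it is worth checking in Google Webmaster Tools that no other pages inadvertently match the pattern.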
On-Page Optimization | anakyn