Dates in URLs
-
I have an issue with duplicate content errors and duplicate page titles which is penalising my site. This has arisen because a number of URLs are suffixed with dates and have been spidered. In principle I do not want any URL with a suffixed date to be spidered.
E.g.:
www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/06_07_13/13_07_13
http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm/20_07_13/27_07_13
Only this URL should be spidered:
http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm
I have over 10,000 of these duplicates. Firstly, I wish to remove them en bloc from Google (not one by one), and secondly I wish to amend my robots.txt file so the URLs are not spidered. I do not know the format for either.
Can anyone help please.
-
Thanks Kyle.
Particularly grateful for the Disallow format; they are the only URLs using an underscore, so it will work for me. Will be checking why these are being created.
Do I need to remove them using the Removal Tool in Google, and is there a format for doing this en bloc?
Thanks again,
Alan
-
Hi Alan,
I would probably start by adding a disallow rule to robots.txt.
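Since the question asks for the format: robots.txt is a plain-text file that lives at the site root (so www.carbisbayholidays.co.uk/robots.txt). A minimal sketch for this case, assuming underscores really only appear in the dated URLs:

```
User-agent: *
Disallow: /*_
```

Bear in mind the `*` wildcard is a Google/Bing extension rather than part of the original robots.txt standard, so verify the behaviour with the robots.txt testing tool in GWT before relying on it.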
**Disallow: /*_** may work and block all your dated URLs from being crawled, but it may also have adverse effects if you have any other URLs containing underscores. To test whether this solution would work, I would first implement a disallow directive targeting a single chosen date, **Disallow: /*20_07_13** for example (robots.txt rules match from the start of the path, so the leading wildcard is needed to catch a date in the middle of a URL), and then check whether Google has dropped the page from its index. GWT should tell you whether you have inadvertently blocked any other pages by doing so.
You should also be thinking about how these URLs are being created and taking action to prevent it. Consider implementing canonical tags, if you haven't already, to clean up any potential duplication issues.
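A canonical tag is a one-line addition to the `<head>` of each dated variant, pointing at the clean page. Sketching it with the example URL from the question:

```html
<!-- In the <head> of every dated variant of the page -->
<link rel="canonical" href="http://www.carbisbayholidays.co.uk/carbis-bay/houses-in-carbis-bay/seaspray.htm" />
```

One caveat: Google only sees a canonical tag if it can crawl the page, so a robots.txt block and canonical tags work against each other. Canonicals are the better fix if the dated URLs need to stay reachable to visitors.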
Cheers,
K