How to fix this issue?
-
I redesigned my website, moving from Wix to plain HTML. The URLs changed from
http://www.spinteedubai.com/#!how-it-works/c46c
to
http://www.spinteedubai.com/how-it-works.html
and the same applies to all other pages. How can I fix this? Both versions of each page are also indexed in Google.
-
Hi Alexander,
While there are some server-side technical things you can do to force a 404 error for a given URL, the best thing to do is remove the content in question from your server. At the very least, removing it should produce a 404 status code when you attempt to visit the URL that once housed the content. Ideally, if you can configure a custom 404 page that is more user-friendly, that's even better.
Now, depending on how your server is configured, there may be instances when a URL should produce a 404 error, but doesn't. I only bring this scenario up as a possibility because it's something I am currently dealing with on one of the sites I manage.
In any case, you may need to work closely with your server administrator or Web developer to achieve what you need. Most likely, it's just a matter of removing the old content from the server. Hope that helps!
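To make Dana's suggestion concrete: if the site runs on Apache with .htaccess overrides enabled (an assumption — check with your host), a custom 404 page can be wired up in one line. The /404.html path here is a placeholder for whatever friendly error page you create:

```apache
# Serve a custom, user-friendly error page whenever a URL returns 404.
# /404.html is a placeholder path relative to the document root.
ErrorDocument 404 /404.html
```

Once the old content is deleted from the server, requests to the old URLs should return a 404 status automatically; this directive only controls what the visitor sees when that happens.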
Dana
-
How can I set up a 404 error? What are the steps?
-
Hi Alexander,
It looks like you've implemented the canonical tags properly. It can, however, take Google a very, very long time (sometimes years) to remove old content. If you really want the old page/URL out of Google's index, the best and quickest way to achieve that is to make sure the old page produces a proper 404 status code, then use GWT's Remove URL tool to request that Google remove it from their index. This still isn't immediate, but I've seen URLs removed in as little as a week using this method. Hope that helps!
Dana
-
Hi Alexander,
You can either 301 redirect the old page http://www.spinteedubai.com/#!how-it-works/c46c to the new page http://www.spinteedubai.com/how-it-works.html,
or you can set up a rel=canonical tag if it's the same content and you want to keep the old URL.
You would then have to either wait or use this to remove the URL - https://www.google.com/webmasters/tools/removals
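One caveat with the old Wix URL: everything after the # is never sent to the server, so a conventional server-side 301 cannot match #!how-it-works/c46c directly. Google crawled Wix-era hashbang pages in their ?_escaped_fragment_= form, so the redirect has to target that form instead. A hedged sketch for Apache .htaccess, assuming mod_rewrite is available:

```apache
RewriteEngine On
# Google crawled /#!how-it-works/c46c as /?_escaped_fragment_=how-it-works/c46c
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=how-it-works/c46c$
RewriteRule ^$ /how-it-works.html? [R=301,L]
```

The rel=canonical alternative is a single tag in the head of the duplicate page, pointing at the preferred URL:

```html
<link rel="canonical" href="http://www.spinteedubai.com/how-it-works.html" />
```

A rule like the rewrite above would need to be repeated (or generalized) for each old page.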
Related Questions
-
Magento Rewrite Issue
Moz's crawler has thrown up a bunch of crawl issues for my site. The site is Magento-based, and I recently updated the themes, so some routes may have become redundant. Moz has identified 289 pages with a Temporary Redirect. I thought Magento managed the redirects if I set "Auto-redirect to Base URL" to Yes (301 Moved Permanently), but this is enabled on my store and I still get the errors. The only thing I could think of was to add a robots.txt and handle the redirection of these links from there. But handling redirection for 289 links is no mean task. I was looking for any ideas that could fix this without me doing it manually.
Technical SEO | abhishek19860 -
How to fix google index filled with redundant parameters
Hi All

This follows on from a previous question (http://moz.com/community/q/how-to-fix-google-index-after-fixing-site-infected-with-malware) that, on further investigation, has become a much broader problem. I think this is an issue that may plague many sites following upgrades from CMS systems.

First, a little history. A new customer wanted to improve their site ranking and SEO. We discovered the site was running an old version of Joomla and had been hacked. URLs such as http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate redirected users to other sites, and the site was ranking for "buy adobe" and "buy microsoft". There was no notification in Webmaster Tools that the site had been hacked. So an upgrade to a later version of Joomla was required, and we implemented SEF URLs at the same time. This fixed the hacking problem; we now had SEF URLs, fixed a lot of duplicate content, and added new titles and descriptions.

The problem is that after a couple of months things aren't really improving. The site is still ranking for adobe and microsoft and a lot of other rubbish, and URLs like http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate are still sending visitors, but to the home page, as are a lot of the old redundant URLs with parameters in them. I think it is default behavior for a lot of CMS systems to ignore parameters they don't recognise, so http://domain.com/index.php?vc=427&Buy_Pinnacle_Studio_14_Ultimate displays the home page and returns a 200 response code. My theory is that Google isn't removing these pages from the index because it's getting a 200 response code from the old URLs, and is possibly penalizing the site for duplicate content (which doesn't show up in Moz because there aren't any links on the site to these URLs). The index in Webmaster Tools shows over 1,000 URLs indexed when there are only around 300 actual URLs. It also shows thousands of URLs for each parameter type, most of which aren't used.

So my question is how to fix this. I don't think 404s or similar are the answer, because there are so many that trying to find each combination of parameters would be impossible. Webmaster Tools advises not to make changes to parameters, but even so, I don't think resetting or editing them individually is going to remove them, only change how Google indexes them (if anyone knows different, please let me know). Appreciate any assistance, and also any comments or discussion on this matter. Regards, Ian
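For what it's worth, enumerating combinations may not be necessary: if all the legacy URLs share a recognizable query-string pattern (like the vc= parameter above), a single rewrite rule can return 410 Gone for the whole family. A hedged sketch for Apache, assuming the new SEF URLs no longer use index.php with that parameter:

```apache
RewriteEngine On
# Any index.php request carrying the legacy vc= parameter returns
# 410 Gone ([G] flag), signaling the page is permanently removed.
RewriteCond %{QUERY_STRING} (^|&)vc= [NC]
RewriteRule ^index\.php$ - [G,L]
```

This is a sketch, not a drop-in fix: the pattern would need to be adapted to whatever parameters the old URLs actually used, and tested so it doesn't catch any URL the current site still serves.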
Technical SEO | iragless0 -
How do I fix issue regarding near duplicate pages on website associated to city OR local pages?
I am working on an e-commerce website where we have added 300+ pages to target different local cities in the USA. We have added distinct paragraphs on 100+ pages to remove the internal duplicate-content issue and protect the website from a Panda penalty. You can visit the following pages to see more, and we have added unique paragraphs on a few of them. But I have big concerns about the other elements on the page, like the banner gallery, front banner, tool, and a few other attributes, which are common to every page apart from the 4-5 sentence paragraph. I compiled an XML sitemap with all the local pages and submitted it to Google Webmaster Tools on 1st June 2013, but I can see only 1 page indexed by Google in Webmaster Tools. http://www.bannerbuzz.com/local http://www.bannerbuzz.com/local/US/Alabama/Vinyl-Banners http://www.bannerbuzz.com/local/MO/Kansas-City/Vinyl-Banners and so on... Can anyone suggest the best solution?
Technical SEO | CommercePundit0 -
Geo Domains & SEO Issues
Hi. Are there issues with duplicate content on geo-specific domains? For example, we have a client with a .co.uk site who wants to create a .ie website for the local market, rather than sending those visitors to the UK site. The content would be duplicated with limited customization. What issues would you foresee, and how could they be overcome? Thank you
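For reference, the standard way to tell Google that the .co.uk and .ie sites are regional variants rather than duplicates is hreflang annotations in the head of each page, with every variant listing all the others. A sketch using placeholder domains and paths (swap in the client's real URLs):

```html
<!-- On both the UK and Irish versions of a page: -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/page.html" />
<link rel="alternate" hreflang="en-ie" href="http://www.example.ie/page.html" />
```

The ccTLDs themselves already give a strong geotargeting signal; hreflang mainly prevents the two variants from competing with or filtering each other in search results.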
Technical SEO | RadicalMedia0 -
301 issue in IE9
My development team recently discovered an issue with 301 redirects being cached in IE9. They did some research, found the situation was very complicated, and their solution was to use 302s and no longer use 301s. As a temporary fix for a few URLs I was okay with this, but we have a site redesign launching in a few months, and I am quite worried if we have to do all of our redirects as 302s. Has anyone else had this issue with IE9 and 301s? I could use any advice on how to overcome it. Thanks!
Technical SEO | SEI0 -
HTTPS-enabled site with SEO issues
Hello, is there a problem with SEO bots etc. crawling and ranking my website well if the entire site is HTTPS-enabled? We have a sign-in button which results in the next page being HTTPS, along with the main homepage, and all other pages are HTTPS-enabled too. Are there any major setbacks for our SEO strategies? How do I overcome these issues?
Technical SEO | shanky10 -
What does 'blocked by meta robots' mean? How do I fix it?
When I get my crawl diagnostics, I see a "blocked by meta robots" warning, which means that my page is not being indexed in the search engines... obviously this is a major issue for organic traffic! What does it actually mean, and how can I fix it?
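"Blocked by meta robots" means the page's HTML contains a robots meta tag telling search engines not to index it. In general it looks like this (the exact content value on your page may differ):

```html
<!-- This tag in the page <head> blocks indexing: -->
<meta name="robots" content="noindex, nofollow" />

<!-- To allow indexing, remove the tag entirely, or change it to: -->
<meta name="robots" content="index, follow" />
```

On CMS-built sites the tag is usually emitted by a setting or SEO plugin rather than hand-written, so look for a "discourage indexing" or per-page "noindex" option before editing templates directly.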
Technical SEO | rolls1230 -
Search engines have been blocked by robots.txt. How do I find and fix it?
My client's site royaloakshomesfl.com is coming up in my dashboard as having "Search engines have been blocked by robots.txt", only I have no idea where to find it and fix the problem. Please help! I do have access to Webmaster Tools, and this site is a WP site, if that helps.
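Since it's a WordPress site, the usual culprit is Settings → Reading → "Discourage search engines from indexing this site"; when checked, many WordPress versions serve a blocking robots.txt that looks like this:

```
User-agent: *
Disallow: /
```

Unchecking that setting normally resolves it. If a physical robots.txt file exists in the site root instead, an empty Disallow line allows full crawling:

```
User-agent: *
Disallow:
```

You can confirm the fix by visiting royaloakshomesfl.com/robots.txt directly and checking that "Disallow: /" is gone.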
Technical SEO | LeslieVS0