Why is my site not being indexed?
-
Hello....
I have to admit I feel like a newbie (I am one, but now I really feel like it)... We worked with a client until January this year; they kept going on their own until September, when they contacted us again. While we were gone, someone on the team that handled things updated the robots.txt file to Disallow everything, for maybe three weeks before we came back in. Additionally, they were working on a different subdomain, the new version of the site, and of course they didn't block robots on that one. So now the whole site has been duplicated, content and all: the exact same pages exist on the subdomain that was public while the main site was blocked.
We came in, changed the robots.txt file on both servers, resubmitted all the sitemaps, shared our URL on Google+... everything the book says... but the site is not getting indexed. It's been five weeks now with no response whatsoever. We ranked highly for several important keywords, and now that's gone.
I know you guys can help; any advice will be highly appreciated.
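For reference, the difference between the broken robots.txt and the fixed one is a single character, which is easy to miss when auditing both servers. A sketch (generic directives, not the client's actual file):

```text
# What the site had for ~3 weeks — blocks ALL crawlers from everything:
User-agent: *
Disallow: /

# The corrected file — an empty Disallow value permits crawling of everything:
User-agent: *
Disallow:
```

It is worth double-checking that the fixed file is actually being served at /robots.txt on both hosts, since a stale cache or deploy can keep the old blocking version live.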
thanks
Dan
-
Thanks for the response.
Point 1 I already did...
Point 2 might be a solution. I was kind of blinded by the mess, so I tried to block that domain... I'll try redirects now.
Any other suggestion is welcome.
Dan
-
(Sorry if my response is newbie-ish as well. Not sure what you've tried already, but these are the basics.)
1. I assume you've set up Google Webmaster Tools for both the subdomain and the original domain? Anything in there, like trying to fetch both sites as Googlebot, blocked URLs, any strange messages or errors? Have you tried fetching and resubmitting the entire new site to be indexed? Basically, GWMT is your friend in this situation.
2. If you've now blocked the subdomain with robots.txt, I think that may delay resolving the situation. It may not be ideal, but setting up 301 redirects from the subdomain back to the original domain would be the best course of action, and moving the subdomain (which I assume is for development or staging purposes) to a new subdomain with a proper robots.txt file would be ideal.
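As a sketch of point 2, a host-wide 301 on an Apache server might look like the .htaccess below. The domain names are placeholders, and this assumes mod_rewrite is available:

```apache
# .htaccess on the duplicate subdomain (e.g. new.example.com).
# Permanently redirect every URL to the same path on the main domain,
# so the duplicate pages pass their signals back where they belong.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^new\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Note that for Googlebot to see these redirects, the subdomain must not be blocked in robots.txt — a block would prevent it from ever discovering the 301s.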
Related Questions
-
Google Is Indexing My 301 Redirects to Other Sites
Long story, but I now have a few links on my site 301 redirecting to YouTube videos or eCommerce stores. They carry a considerable amount of traffic that I benefit from, so I can't take them down, and that traffic comes from other websites; so basically I have backlinks from places I don't own pointing to my redirect URLs (e.g. http://example.com/redirect). My problem is that Google is indexing them and won't let them go. I have tried blocking that URL in robots.txt, but Google still indexes it uncrawled. I have also tried allowing Google to crawl it and adding noindex via robots.txt, and I have tried removing it in GWT, but it pops back again after a few days. Any ideas? Thanks!
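One approach worth testing (a sketch, not a confirmed fix): robots.txt cannot apply noindex, and a blocked URL can still be indexed uncrawled, so instead you could unblock the URL and attach an X-Robots-Tag header to the redirect response itself. In Apache server config (the path /redirect is from the example above; <Location> is not allowed in .htaccess):

```apache
# Apache server config. The existing 301 redirect stays wherever it is
# already defined; this only adds a noindex header to responses for the URL.
# "Header always set" attaches the header to 3xx responses as well as 200s.
<Location "/redirect">
    Header always set X-Robots-Tag "noindex"
</Location>
```

Whether Google honors noindex on a redirect response is worth verifying in practice; the key prerequisite is that the URL must be crawlable (not blocked in robots.txt) for Googlebot to ever see the header.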
Intermediate & Advanced SEO | cuarto7150
Proper 301 in Place but Old Site Still Indexed In Google
So I have stumbled across an interesting issue with a new SEO client. They recently launched a new website and implemented a proper 301 redirect strategy at the page level for the new domain. What is interesting is that the new website is now indexed in Google, BUT the old website domain is also still indexed. I even checked the Google cache date, and it shows the new website with a cache date of today. The redirect strategy has been in place for about 30 days. Any thoughts or suggestions on how to get the old domain de-indexed in Google and get all authority passed to the new website?
Intermediate & Advanced SEO | kchandler0
How do I tell if competitor's links are good?
One strategy I have seen recommended over and over is to look at your competitors' backlinks and see if any could be relevant for your site and worth pursuing. My question is: how do I evaluate a link and avoid pursuing some penalized site? I would guess checking whether the site is in Google's index is a good idea, since some webmasters may not be aware they are penalized. Is it just DA and whether they are indexed? Many sites I have seen have DA in the teens but are legitimate in our industry. Should they be ruled out due to low DA? Also, I see links from directories on many competitor sites. It seems a controversial subject, but assuming the directory is industry-specific, is it OK? Thanks in advance!
Intermediate & Advanced SEO | Chris6610
Should we use URL parameters or plain URLs?
Hi, the development team and I are having a heated discussion about one of the more important things in life, i.e. URL structures on our site. Let's say we are creating an AirBNB clone, and we want to be found when people search for "apartments new york". As we have both houses and apartments in every city in the U.S., it would make sense for our URL to at least include these, so clone.com/Apartments/New-York. But users are also able to filter on price and size. This isn't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be canonical for all apartment/New York searches. But how should the URL look for people wanting a max price of $300 and 100 sqft? clone.com/Apartments/New-York?price=300&size=100, or (we are using Node.js, so no problem) clone.com/Apartments/New-York/Price/300/Size/100? The developers hate URL parameters with a vengeance, think the last version is preferable and most readable, and say that as long as we use a canonical pointing everything to clone.com/Apartments/New-York it won't matter to good old Google. I think URL parameters are the way to go, for two reasons: one, Google might figure out by itself that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and two, it is possible in Webmaster Tools to tell Google not to worry about a parameter. We have agreed to disagree on this point, and will let the wisdom of Moz decide what we ought to do. What do you all think?
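Either URL style works as long as the canonical URL is derived consistently. As a sketch (the function name is made up, not from the thread), a small Node.js helper that collapses either filter style down to the canonical listing URL the thread agrees on:

```javascript
// canonical.js — reduce a filtered listing URL to its canonical form.
// Assumes the canonical is always scheme + host + /Apartments/<City>,
// with price/size filters carried either as query parameters or as
// extra path segments (the two styles under discussion above).

function canonicalUrl(rawUrl) {
  const url = new URL(rawUrl);
  // Keep only the first two path segments: /Apartments/<City>.
  // This drops both /Price/300/Size/100 segments and ?price=300&size=100.
  const segments = url.pathname.split('/').filter(Boolean).slice(0, 2);
  return `${url.origin}/${segments.join('/')}`;
}

// Both filter styles collapse to the same canonical:
console.log(canonicalUrl('https://clone.com/Apartments/New-York?price=300&size=100'));
// -> https://clone.com/Apartments/New-York
console.log(canonicalUrl('https://clone.com/Apartments/New-York/Price/300/Size/100'));
// -> https://clone.com/Apartments/New-York
```

Whichever style wins the argument, emitting `<link rel="canonical">` from one shared helper like this keeps the two from drifting apart.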
Intermediate & Advanced SEO | Peekabo0
After receiving a "Googlebot can't access your site" message, would this stop your site from being crawled?
Hi Everyone,
A few weeks ago I received a "Googlebot can't access your site..... connection failure rate is 7.8%" message from Webmaster Tools. I have since fixed the majority of these issues, but I've noticed that all pages except the main home page now have a PageRank of N/A, while the home page still has a PageRank of 5. Have the connectivity issues reduced the PageRanks to N/A, or is it something else I'm missing? Thanks in advance.
Intermediate & Advanced SEO | AMA-DataSet
What Are Your Link Building Tactics?
So my question is: what are your link building tactics? I always have a bit of a problem building links for my websites. Also, do you use some kind of tool? If yes, can you recommend it?
Intermediate & Advanced SEO | Angelos_Savvaidis0
Google & Bing not indexing a Joomla Site properly....
Can someone explain the following to me please?

The background: I launched a new website on a new domain with no history. I added the domain to my Bing Webmaster Tools account, verified it, and submitted the XML sitemap at the same time. I added the domain to my Google Analytics account, linked Webmaster Tools, and verified the domain; I was NOT asked to submit the sitemap or anything. The site has only 10 pages.

The situation: The site shows up in Bing when I search using site:www.domain.com, with 1 page indexed (the home page). The site shows up in Google when I search using site:www.domain.com, with 30 pages indexed. Note that Google found 30 pages even though the sitemap and site have only 10. I have found out, due to the way the site has been built, that there are "hidden" pages, i.e. pages displaying half of a page, as they are made up of elements in Joomla.

My questions: 1. Why does Bing find 1 page and Google find 30? Surely Bing should at least find the 10 pages of the site, as it has the sitemap? (I suspect I know the answer, but I want other people's input.) 2. Why does Google find these hidden elements, and what's the best way to sort this: controlling the .htaccess or robots.txt, OR having the programmer look into how Joomla works to stop this happening? 3. Have any Joomla experts out there had the same experience with "hidden" pages showing when you type site:www.domain.com into Google? I will look forward to your input! 🙂
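For question 2, a robots.txt approach would be to disallow the URL patterns the hidden element pages use. The paths below are hypothetical, since the actual patterns depend on how this particular Joomla site was built:

```text
# robots.txt — keep Joomla component/element URLs out of the crawl.
# Replace these paths with whatever patterns the "hidden" pages actually use.
User-agent: *
Disallow: /index.php?option=com_
Disallow: /component/
```

One caveat: for pages Google has already indexed, robots.txt alone only stops crawling; getting them dropped from the index may additionally require a noindex tag or a removal request in Webmaster Tools.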
Intermediate & Advanced SEO | JohnW-UK0
What's the best practice for adding a blog to your site post-Panda: subdomain or subdirectory?
Should I use a subdomain or a subdirectory? I was going to use a subdirectory; however, I have been reading a lot of articles on the use of subdomains post-Panda and the advantages of using them instead of subdirectories. Thanks, Ari
Intermediate & Advanced SEO | dublinbet0