How to avoid Sandbox?
-
What is the Sandbox? To avoid something like the Sandbox, you should first know exactly what it is. But nobody knows whether the Sandbox even exists, so let's focus on the main problem here: how do I get my pages indexed? Over the years I have tried a lot of techniques, but I found only one that seems to work:
1. If your site is not dynamic, make it so.
2. Create a sitemap and a feed (I recommend RSS 2.0).
3. Reference the sitemap in your robots.txt, on the last line, like this: Sitemap: http://www.yourdomainname.com/sitemap.xml
4. Submit the sitemap in the Sitemaps section of your Webmaster Tools account.
5. Submit your RSS feed to the main RSS directories (just google the words and you'll find plenty of them). Start with FeedBurner, to please Google.
Wait a week or so and you'll see your pages start appearing in the index. Good luck!
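For the sitemap step, here is a minimal sketch of generating a sitemaps.org-format sitemap.xml with Python's standard library. The domain and page URLs are placeholders for your own pages, not real ones:

```python
# Minimal sitemap generator sketch (hypothetical URLs).
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # Root element with the sitemaps.org protocol namespace.
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    today = date.today().isoformat()
    for loc in urls:
        # One <url> entry per page, with its location and last-modified date.
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = today
    return ET.tostring(urlset, encoding="unicode")

pages = [
    "http://www.yourdomainname.com/",
    "http://www.yourdomainname.com/about.html",
]
print(build_sitemap(pages))
```

Save the output as sitemap.xml at the site root, then reference it from robots.txt with the Sitemap: line shown above.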
-
The Google Sandbox is a debated concept from 2004 and 2005 that has never been confirmed, so you shouldn't concern yourself with it too much. Even if it exists, the Sandbox would only temporarily penalize new domains for their first few months. If you are worried about being penalized, either temporarily or permanently, there are a couple of things you can always do:
1. Create great content
2. Use aged domains
If you focus on making the best site possible and don't worry about making a quick buck, you shouldn't have a problem.
-
We need a bit more info.
I don't believe there is a sandbox as such.
Related Questions
-
How can I avoid duplicate brand name in the title serp?
Hello, how can I avoid a duplicated brand name in the SERP title? For example:
Technical SEO | jh0sz
On this page: https://www.latam.com/es_cl/ the title set is: <title>LATAM Airlines en Chile | Sitio Oficial</title>
But the SERP shows: LATAM Airlines en Chile | Sitio Oficial - LATAM.com. Can I avoid "LATAM.com" at the end of the title? Regards
-
Redesigning client website and will be losing a lot of landing pages. How to avoid tanking search traffic?
We are working with a client who is changing the direction of the company's marketing efforts. The current site includes many pages (approximately 100), one for each partner they work with. The new site will be losing many of these, and we want to be sure we don't destroy organic traffic/rankings in the process. These landing pages don't directly garner the most traffic, but it will definitely be a big change in the size of the site. Any advice on how best to handle the redesign is appreciated, thanks!
Technical SEO | KMofOutlier
-
How different does content need to be to avoid a duplicate content penalty?
I'm implementing landing pages that are optimized for specific keywords. Some of them are substantially the same as other pages (perhaps 10-15 words different). Are the landing pages likely to be identified by search engines as duplicate content? How different do two pages need to be to avoid the duplicate content penalty?
Technical SEO | WayneBlankenbeckler
-
What is the value in Archiving and how can I avoid negative SEO impact?
I have been very busy reducing GWT duplicate content errors on my website, www.heartspm.com, built on a WordPress platform. Each month, blog entries are archived, and each monthly archive generates a duplicate description in Google. We post 2-3 blog entries per month and they don't really go out of date; most are not news-related but rather nuggets of information on entomology. Do I need to use the archiving feature? Can I turn it off? Should I switch to archiving perhaps once per year instead of every month, and how is that done? How do I stop Google from creating its own duplicate meta descriptions each month for these archive entries? Should I set the archives to NOINDEX, FOLLOW? I'm not the programmer, but I have some technical know-how, so I have a lot of half-baked ideas and answers that could use some polishing. Thanks for your help and suggestions. Gerry
Technical SEO | GerryWeitz
-
Avoiding duplicate content with national e-commerce products and localized vendors
Hello 'mozzers! For our example purposes, let's say we have a national cog reseller, www.cogexample.com, focusing on B2C cog sales. The website's SEO efforts revolve around keywords with high search volumes -- no long-tail keywords here! CogExample.com sells over 35,000 different varieties of cogs online, broken into search-engine-friendly categories and using both HTML and meta pagination techniques to ensure adequate deep linking and indexing of their individual product pages. With their recent fiscal success, CogExample.com has signed 2,500 retailers across the United States to resell their cogs. CogExample.com's primary objective is B2C online sales for their highly sought search terms, i.e. "green cogs". However, CogExample.com also wants their retailers to show up for local/geo search, i.e. "seattle green cogs". The geo/location-based retailer content will be delivered from the same database as the primary online store, and is thus very likely to cause duplicate content issues. Questions: 1. If the canonical meta tag is used to point the geo-based product to the primary online product, the geo-based product will likely be placed in the supplemental index. Is this correct? 2. Given the massive product database (35,000) and retailer count (2,500), it is not feasible to rewrite 87,500,000 pages of content to satisfy unique content needs. Is there any way to prevent the duplicate content penalty? 3. Google product feeds will be used to localize content and feed Google's product search. Is this "enough" to garner sizable amounts of traffic and/or retain SERP ranks?
Technical SEO | CatalystSEM
-
How to avoid 404 errors when taking a page off?
So... we are running a blog that was supposed to have great content. Working on SEO for a while, I discovered there is too much keyword stuffing and some spammy SEO tricks for WordPress that were supposed to make it rank better. In fact, that worked, but I'm not taking the risk of getting slapped by the Google puppy-panda. So we decided to restart our blog from zero and make a better try. So: every page was already ranking in Google. SEOmoz hasn't run the crawl yet, but I'm really sure the crawlers would report a lot of 404 errors. My question is: can I avoid these errors with some tool in Google Webmaster Tools' sitemaps, or should I make some rel=canonicals or 301 redirects? Does Google penalize me for that? It's kind of obvious to me that the answer is YES. Please, help 😉
Technical SEO | ivan.precisodisso
-
Sandboxed
Hi all, any help with the following? We built a new site for a customer in June of last year. We then cracked on with the on-page and off-page SEO. All white hat... good quality. Three months in, the site was still not ranking with Google and indeed had been sandboxed. All was working fine with Bing and Yahoo. We followed all the steps to get recognized by Google, but to no avail. In December we took the drastic step of providing the customer with a completely new site... new content, design, structure, etc. In January we went back and fixed all the external linking sources to link to the new pages on the new site. Now, 7 months in... the site is STILL sandboxed. All still fine with Bing and Yahoo. Thoughts, anyone?
Technical SEO | SEOwins
-
How can I have pages with media that changes and avoid duplicate content when the text stays the same?
I want to have a page that describes a specific property and/or product. The top part of the page has media options such as video and photos, while the bottom includes the description. I know I can set up the media in tabs separated by JavaScript, so everything resides on one page and there are no duplicate content issues. Example: http://www.worldclassproperties.com/properties/Woodside BUT what if I need the photos and the videos to have separate URLs so I can link to them individually? For example, for a real estate site blog, I may want to send visitors to the page of the home tour. I don't want to link them to the version of the page with the photos, because I want them to arrive on the video portion. Example: http://www.worldclassproperties.com/properties/Woodside?video=1 Is there any way to get around the duplicate content problem that would result from the shared product/property description? I do not have the resources in the budget to make two unique descriptions for every page.
Technical SEO | WebsightDesign