How can I get back the deleted campaign?
-
I know that SEOmoz says: "if deleted, you will not be able to get it back. So be careful!" here: http://www.seomoz.org/help/campaign-settings, but we really need it back, and it's urgent! Has anyone ever recovered a deleted campaign? I just sent an email to the SEOmoz help team, but it's 6 PM now... Can anyone help? Thanks!
-
Does the campaign show up in your archived campaigns by any chance, at http://pro.seomoz.org/campaigns? If not, it's only a few more hours until the SEOmoz team should be able to check for you.
Good luck. I hope you get it back, although the chances look slim.
Related Questions
-
How can I identify technical problems with my website?
I hope you are all in good health. I would appreciate any tips on fixing technical issues on my website; could anyone please help me resolve them? Thanks in advance. Here is my website: Apkarc
Technical SEO | jjbndjkui880
How To Get Rid Of Duplicate Title Tags
Hey everyone, I have a couple of pages with duplicate title tags, and that's because those pages are pretty much the same. I took the lower-ranking of the two pages and deleted it. Is that all I have to do? Or do I need to set up a redirect or anything else? Thanks, Ruben
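For what it's worth, if the deleted page's URL had any links or traffic, a 301 redirect to the surviving page is generally worth adding rather than leaving a 404. A minimal sketch in Apache .htaccess, with hypothetical URLs standing in for the real pages:

```apache
# Hypothetical URLs: send the deleted duplicate's address
# permanently (301) to the page that was kept.
Redirect 301 /old-duplicate-page/ /surviving-page/
```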
Technical SEO | KempRugeLawGroup0
@moz staff Where does OSE get Facebook Share information?
When using OSE, where does it pull the Facebook data from? Open Graph? Like this?
https://graph.facebook.com/http://www.moz.com
I am trying to find out because my URLs are coming back with completely different information:
https://graph.facebook.com/http://www.discoverhawaiitours.com/to/discovertheroadtohana_21a.html
We are using the ShareThis plugin, and I think it's not reporting the right info.
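Not an answer to where OSE itself pulls its numbers from, but the same Graph lookup the question performs in a browser can be scripted for a quick comparison. A minimal sketch in Python; note this legacy unauthenticated endpoint may now require an access token or return an error:

```python
import json
import urllib.request

# Append the page URL to the Graph endpoint, exactly as in the question,
# and inspect whatever JSON comes back.
page_url = "http://www.discoverhawaiitours.com/to/discovertheroadtohana_21a.html"
graph_url = "https://graph.facebook.com/" + page_url

with urllib.request.urlopen(graph_url) as response:
    data = json.load(response)

print(json.dumps(data, indent=2))
```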
Technical SEO | Francisco_Meza0
Used Machines Site - Should I delete sold machine pages?
I'll start posting used machines (excavators and wheel loaders) from a local company on a WordPress blog. They buy used machines every day and sell machines every day as well. When they sell a machine, should I:
A) Edit the page with a "sold" notice and show similar machines?
B) Change the category of the post (to something like "sold machines")?
C) Make a 301 redirect to one similar machine?
D) Keep the page and make a 301 redirect to one similar machine?
E) Delete the page and make a 301 redirect to one similar machine?
F) Keep the page and make a 301 redirect to a top-10 machines page?
G) Delete the page and make a 301 redirect to a top-10 machines page?
H) Any suggestions are welcome 🙂
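If one of the redirect options (C through G) is chosen, the per-page 301 itself is straightforward. A minimal sketch in Apache .htaccess, with entirely hypothetical machine URLs:

```apache
# Hypothetical: a sold machine's page redirected permanently
# to the closest remaining similar machine.
Redirect 301 /machines/used-excavator-2008/ /machines/used-excavator-2010/
```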
Technical SEO | SeoMartin10
Can too many pages hurt crawling and ranking?
Hi, I work for the local yellow pages in Belgium. Over the last few months we introduced a successful technique to boost SEO traffic: we have created over 150k new pages, all targeting specific keywords and all containing unique content, plus a site architecture that enables Google to find these pages through crawling, XML sitemaps, and so on. All signs (traffic, indexation of the XML sitemaps, rankings, ...) are positive. So far so good. We are able to quickly build more unique pages, and I wonder how Google will react to this type of "large scale operation": can it hurt crawling and ranking if Google notices big volumes of (unique) content? Please advise.
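As a point of reference for a site of that size: a single XML sitemap file is capped at 50,000 URLs, so 150k pages normally means several sitemap files tied together by a sitemap index. A minimal sketch with hypothetical file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical child sitemaps, each holding up to 50,000 URLs -->
  <sitemap><loc>http://www.example.com/sitemap-pages-1.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-pages-2.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-pages-3.xml</loc></sitemap>
</sitemapindex>
```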
Technical SEO | TruvoDirectories0
Oh no, Googlebot cannot access my robots.txt file
I just received an error message from Google Webmaster Tools. I wonder if it has something to do with the Yoast plugin. Could somebody help me troubleshoot this? Here's the original message:
Over the last 24 hours, Googlebot encountered 189 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
Recommended action
If the site error rate is 100%:
- Using a web browser, attempt to access http://www.soobumimphotography.com//robots.txt. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
- If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
- If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.
If the site error rate is less than 100%:
- Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
- The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
After you think you've fixed the problem, use Fetch as Google to fetch http://www.soobumimphotography.com//robots.txt to verify that Googlebot can properly access your site.
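A quick way to run the first recommended check from the command line rather than a browser: fetch the robots.txt once with a browser-style user agent and once identifying as Googlebot, and compare the responses. A minimal sketch in Python:

```python
import urllib.request

# URL copied from the Google message (including its double slash).
ROBOTS_URL = "http://www.soobumimphotography.com//robots.txt"

USER_AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Googlebot/2.1 (+http://www.google.com/bot.html)",
}

for name, agent in USER_AGENTS.items():
    request = urllib.request.Request(ROBOTS_URL, headers={"User-Agent": agent})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            print(f"{name}: HTTP {response.status}")
    except Exception as error:  # firewall blocks, 5xx errors, timeouts, ...
        print(f"{name}: {error}")
```

If the browser-style request succeeds while the Googlebot one fails, that points at the firewall or server configuration mentioned in the message.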
Technical SEO | BistosAmerica0
How can I get unimportant pages out of Google?
Hi Guys, I have a (newbie) question. Until recently I didn't have my robots.txt written properly, so Google indexed around 1900 pages of my site, but only 380 of them are real pages; the rest are all /tag/ or /comment/ pages from my blog. I have now set up the sitemap and the robots.txt properly, but how can I get the other pages out of Google? Is there a trick, or will it just take a little time for Google to remove the pages? Thanks! Ramon
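For reference, a robots.txt along the lines described, keeping crawlers out of the auto-generated blog paths (hypothetical, assuming the tag and comment pages live under /tag/ and /comment/ at the site root):

```
# Hypothetical robots.txt blocking the auto-generated blog paths
User-agent: *
Disallow: /tag/
Disallow: /comment/

Sitemap: http://www.example.com/sitemap.xml
```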
Technical SEO | DennisForte0
How to use overlays without getting a Google penalty
One of my clients is an email subscriber-led business offering deals that are time-sensitive and which expire after a limited, but varied, time period. Each deal is published on its own URL, and in order to drive subscriptions to the email, an overlay was implemented that would appear over the individual deal page, so that the user was forced to subscribe if they wished to view the details of the deal. Needless to say, this led to the threat of a Google penalty, which appears (fingers crossed) to have been narrowly avoided as a result of a quick response on our part to remove the offending overlay. What I would like to ask is whether you have any safe and approved methods for capturing email subscribers without revealing the premium content to users before they subscribe? We are considering the following approaches:
1. First Click Free for Web Search - This is an opt-in service by Google which is widely used for this sort of approach and which stipulates that you have to let the user see the first item they click on from the listings, but can put up the subscriber-only overlay afterwards.
2. No Index, No Follow - If we simply noindex, nofollow the individual deal pages where the overlay is situated, will this remove the "cloaking offense" and therefore the risk of a penalty?
3. Partial View - If we show one or two paragraphs of text from the deal page with the rest covered up by the subscribe-now lockup, will this still be cloaking?
I will write up my first SEOmoz post on this once we have decided on the way forward and monitored the effects, but in the meantime, I welcome any input from you guys.
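For reference, the second option in the list is normally implemented with a robots meta tag in the head of each individual deal page, along these lines:

```html
<!-- Placed in the <head> of every individual deal page -->
<meta name="robots" content="noindex, nofollow">
```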
Technical SEO | Red_Mud_Rookie0