What is the best SEO software?
-
This question relates to doing site audits and creating branded reports for clients. Do SEO agencies create their own software, or do you use one that is available to everyone?
Also, what do you think is the best general SEO software?
-
I would prefer SEOmoz for its MozRank and competitor analysis.
Other tools I have used are MajesticSEO and SEOGix.
Regards,
Infant Raj
-
Just had a look at PowerMapper SortSite and it seems good. I'd be interested to hear what people think of it.
A one-off cost of £239 looks like good value for unlimited usage...
-
Do you send out your SEOmoz reports to your clients?
-
Hi Dan,
Thanks for these recommendations; I will check them out.
It would be good to hear any other thoughts you have about SEO in general: how you get your data and the techniques you use to help your clients.
Thanks again
-
Maybe I am wrong, but I think with SEOmoz, the $199 monthly account will let you do what you like...
-
Hi
Someone recently showed me PowerMapper SortSite. I have not used it personally, but he ran a full audit right in front of me and it looks pretty powerful. It's $150-$500, though.
In general I love SEOmoz's suite of software. For an audit you may want to try their Crawl Test.
Also really nice to have for an audit is Screaming Frog. It's free for up to 500 pages; you have to pay for larger sites.
Depending on the site, I generally use a mix of tools to pull together the info needed, then get it into Excel, Word, or PowerPoint, depending on how it's going to be presented.
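As a rough sketch of that last step: assuming each tool can export its findings as CSV (Screaming Frog can), a few lines of Python with pandas will merge the exports into a single Excel workbook with one sheet per tool. The file names below are hypothetical placeholders for whatever your tools actually export.

```python
# Sketch: combine CSV exports from several audit tools into one Excel
# workbook, one sheet per tool. File names are hypothetical placeholders.
import pandas as pd

# Map sheet names to each tool's CSV export (adjust paths to your own files).
exports = {
    "screaming_frog": "internal_all.csv",
    "crawl_test": "crawl_test_export.csv",
    "majestic": "majestic_backlinks.csv",
}

with pd.ExcelWriter("site_audit.xlsx") as writer:
    for sheet, path in exports.items():
        df = pd.read_csv(path)
        df.to_excel(writer, sheet_name=sheet, index=False)
```

From there it's easy to chart or summarise the workbook for a branded client report.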
-Dan