Best posts made by AlanBleiweiss
-
RE: MOZ TEAM - your system done broke my status as a guru
I LOVE the notion of stars instead of "guru". But as I am now a lowly aspirant, it will be embarrassing if I only have like a half star or something. NO disrespect to aspirants. We were all lowly at one point.
-
RE: How to Not Scrap Content, but still Being a Hub
I honestly can't offer any short term suggestions. It's a big challenge to know what the best short term path is. Ultimately, you'll need to remove all the scraped content. Do that without replacing it, and in the short term you won't see any gains; you may even see some short term losses, since it's possible you're not being purely penalized at the moment.
-
RE: One site or five sites for geo targeted industry
"I shouldn't be telling this stuff."
LOL EGOL you know damn well that if you weren't telling it, I or someone else would. Sure, we're fools for giving this info out for free when we could be charging thousands.
Yet you also know how enriching it is to give back what was freely given to us, or what we earned the hard way, simply because we now have it within us to want to help for its own sake.
And please - don't discount how much the rest of us who help out here when we can appreciate your contributions. You carry a lot of the burden in the Pro Q&A system.
-
RE: Page Rank and offline sites
In addition to Steve having pointed out that the PR you see is totally invalid, and not truly reflective of your real situation, I'd also say that if it's only been back online a couple days, it's way too early to even begin to determine how the site holds up. Nowadays Google assigns an initial estimated ranking value to each page on the site, and the site overall, but then other algorithms are run over the course of days, if not weeks. These provide additional input to either confirm, deny or otherwise cause a modification to that initial assessment.
Then there's the reality that in all that time, if other sites that compete for the same phrases undergo changes themselves in that general time-frame, that too would cause shifting.
Personally I don't put too much weight in ANY "ranking" data. Whether it's MozRank, Keyword search results ranking, or any other kind. All of it's just a general guide.
The only thing that truly matters nowadays for the vast majority of situations is: am I getting a volume of organic search traffic that suits my needs and goals? And from there, is that traffic highly relevant, to the point where those visitors become conversions - meaning how many of them take the next step I'm after once they arrive? Such as filling out specific forms, or making specific purchases, or whatever the metric is.
That data, over time, is what matters way more than ranking numbers.
Just my experience.
-
RE: Could a sitewide footer EXACT MATCH anchor text link hurt or potentially penalize a site?
I agree Geoff - home should be home not "keyword phrase", but you can use keywords on other footer nav links if it makes sense from a user perspective, and again, purely within the site.
-
RE: Maintaining semi-related keyword groups
Hi Mike
I'd definitely not pollute / dilute the focus of Y by keying in on X for that report. Sure, you can reference X in the report, or in linking to it, but that should be secondary only... Let this be an opportunity to build the strength of Y.
-
RE: Service Keyword in URL - too much?
Whether you do retail-pest-control or retail/pest-control - either is acceptable, and as long as the sequence ordering is consistent, you will achieve the same results.
So they should all be industry-service or service-industry.
-
RE: Site is showing forwarded /301 to another website
I understand Muhammed. Seeing a 301 even though you've changed things can be quite a concern. The most important thing is that your site is listed in Google, and clicking the link from there does go to the right page, and I've confirmed that it's properly set.
Quite often SEO requires patience - something I personally have difficulty with.
-
RE: Finding and Removing bad backlinks
Evaluating links is a very time-consuming process. You need to be able to look for "patterns" as a primary task IF you need to worry about links.
HOWEVER
I will also say this - your on-site SEO is suffering, and it's just as likely or even MORE likely to be your primary problem. Why? Because you have not stated that you received a notice from Google informing you that your site was flagged for bad links. If you did NOT get such a notice, then while a poor overall link profile can certainly contribute to a generally declining ranking footprint, it's less likely to be the PRIMARY concern.
For example: Your "Accessories and batteries" category has a terrible topical focus. The page Title doesn't mention what the accessories or batteries are for. Which means that from the very first point of reference on-site, that page fails to communicate the refined focus of the category. Accessories could be about ANYTHING. And so could batteries.
Then, on that page, the header text "Accessories and Batteries" neither includes that topical clarification, nor is it even a proper "h1" header tag. There's no descriptive paragraph-based content on the page reinforcing and strengthening that topical focus. Your Canonical tag does NOT follow SEO best practices for pagination in 2012, and that results in massive amounts of content within the category not being properly identified to further reinforce topical authority. (You should instead be using rel-next/rel-prev and NOT using canonicalization on paginated content, every page title should be unique, and every page within a set should be properly reinforced with its own h1 tag.)
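To make that concrete, here's a minimal sketch of what page 2 of a paginated category could look like under that approach - the category name and URLs are hypothetical placeholders, not your actual pages:
<!-- hypothetical page 2 of a paginated category -->
<head>
  <title>Camera Accessories and Batteries | Page 2</title>
  <!-- no canonical tag pointing every paginated page back to page 1 -->
  <link rel="prev" href="http://www.example.com/camera-accessories-batteries/" />
  <link rel="next" href="http://www.example.com/camera-accessories-batteries/page-3/" />
</head>
<body>
  <h1>Camera Accessories and Batteries - Page 2</h1>
  ...
</body>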
You're not even close to having enough depth of content on product pages (one sentence for the "detailed description"), so with all the "related product" content, sidebar navigation and other "off-topic" content, there's a lot of content on your site likely to be deemed "thin" content.
You have SEVERE page speed problems, a very serious SEO factor in 2012. (tools.pingdom.com reported a 9.3 second load time for the home page and URIValet.com reported 15 seconds).
I haven't even begun to scratch the surface here, because you have a SERIOUS on-site SEO problem that you've apparently either failed to understand or chosen to ignore in this question, which indicates there could be MANY more problems on the site.
Heck - several "minor" template fixes alone could boost your SEO, though if you really want to win, you'd be wise to really address all the high priority factors on-site.
-
RE: Max # of recommended links per page?
Jonathan,
The "rule" used to be 100 links to a page based on Google having included that in their guidelines. They've since removed that numeric value without replacing it with another number. What I find in my large client sites where there's hundreds or thousands of products in a category is the pagination method. The key is to ensure to append each page's Title, URL and h1 with "Page X".
This is best simply because it helps ensure Google discovers all the products and properly credits them to the core category. By trying to force all of the products onto a single page, and using any method that hides most initially for usability, you introduce the possibility that not all those products will be discovered, no matter how much Google does a "good" job at discovering links inside CSS or JavaScript. In reality, their system is far from perfect, and with all that added code, the possibility exists that you cause crawl problems due to imperfect code.
-
RE: Service Keyword in URL - too much?
The number of directories is pretty much an illusion - it's how many clicks it takes to get to something that matters.
That's the key. It ultimately depends on how many case studies you're dealing with as to how you link to them.
Here's an example
Cases is a top level site-wide link.
On the Cases page, there's a description of each service, and a link within that description to that service's page.
Then on that service page, there's a brief snippet for each case study, where you group them on that page by industry type.
That's three clicks down to the individual case study. And in that scenario, you can go with the URL syntax I previously suggested.
So while the "folder structure" "appears" to be four layers deep
case-studies/pest-control/retail/company-name/
The linking methods above are only three deep. So you're totally within SEO best practices.
-
RE: On-Page Keyword Optimization Question
Yes, they give the content area the most weight in regard to individual page topical focus, however search engines do evaluate every word on a page, including content within the source view that visitors don't see. This is why having too much content in header, sidebar and footer areas, or too much code at the source view level causes topical confusion / topical dilution and is considered during the duplicate content evaluation process as well.
The best I can offer in regard to how often a phrase should appear on a page is "does this feel like I repeat this phrase too much?" If you've got the same phrase repeated fifteen times just in the content area, there should be a valid reason other than pure SEO reasoning for that. And a LOT of text around those repetitions.
-
RE: Finding and Removing bad backlinks
Ah okay - that notice is definitely a factor then and an important consideration not initially mentioned. So as long as you have someone else working on the other issues described then we can focus on the patterns concept I initially mentioned.
Several things stand out when I'm reviewing links on a mass scale. I prefer to look at links grouped by domain in the first pass to help see these patterns.
1. Page titles of pages sending links. Quite often, they're titles that blatantly scream junk/low-quality or irrelevant to any topic your site is about, or even link-partnerships... or even outright mention SEO.
2. Domain names/URLs of pages sending links. Same concept - they can quite often obviously communicate that they're junk, irrelevant, or blatantly specifically sites for SEO or links.
3. Anchor Text - if you group by anchor text as a next pass, look for links where the anchor text is exact match keywords, and then look at the page title of the linking page and its domain name. Patterns of low quality can be spotted this way. If needed, you can click over to a URL and just look at the page that link is coming from.
4. After all of that, once you have marked links as being bad, regroup them by domain. At that point you will likely still need to go through the remaining links and visit at least one link from each domain to examine the page, or just look at the overall domain for quality.
NOTE - the part where you examine a site sending links does require you to be able to know how to spot a bad site already. Like - "Can I trust this site?" "Is this site obviously a fake site?" and other such questions need to be asked and answered.
And if a link is on a good site, is it a forum or blog comment? Is it using an SEO relevant keyword as the person's signature name? Or is it even a legitimate and relevant comment, even if the link isn't using keyword anchors?
There are so many subtle indicators I could add but in reality the best way to go is to dive in and remember to look for patterns. As you spend the time doing this work, patterns become more and more obvious...
-
RE: How to handle Meta Tags on Pagination... page 2,3,4....
I routinely do audits on sites with tens of thousands or hundreds of thousands of pages. We've found good results simply by appending the page info to the page Title & Description, like so:
Designer Handbags | Unique Pocketbooks | Page 3
-
RE: What are the SEO ramifications to forwarding your website to Facebook?
Take any site out of the search engines and see what happens. Whether it's a 301 or a 302 redirect will determine the long term impact. 302s tell the search engines "this page has temporarily moved". Which is all good and fine for a matter of hours, maybe even a couple days. After that, all bets are off.
Users will be disoriented. Expecting to arrive at an actual site and instead being sent to a Facebook page is disorienting, alienating and off-putting. Many users may very well never come back to the main site afterward.
Intentionally redirecting users who expect to come to a company site is deceptive. And extremely unprofessional. It breaks an expected user experience trust.
Doing it with an entire site could cause all sorts of red flags from a search engine perspective. Because of those very user deception and alienation issues.
Redirecting an entire site to Facebook is very unwise from an SEO perspective, from a user experience perspective, from a brand cohesiveness perspective. Bad. bad. Bad.
-
RE: Why do I see different ranking results for Bing and Yahoo?
Yahoo actually applies their own criteria to the results. It's one of the "features" they claimed set them apart from Bing when the deal was consummated.
-
RE: Press Releases and SEO in 2013
I use PRWeb.com and opt for the premium service ($360 per release), or you can get a Vocus (PRWeb's parent company) business plan if you send out releases frequently - good savings.
I also don't abuse the various options you can set up with each release for SEO but instead choose the ones that make the most sense for proper relevance.
Ideally, if you have the resources and budget to afford it, yes, it's best to work on making real connections with actual editorial desks at various channels and organizations, because that's where the golden nuggets are that will most likely lead to coverage.
-
RE: Include the company/domain name in page titles and urls?
My standard procedure is to have clients include the company name on any "About" related pages, as well as "Contact" related pages, "press", or "media" related pages. It's important to do this so as to ensure the best chance of outranking other sites that have information on the company. I also don't believe it's necessary to do this with the domain name, since the domain name is already in every internal and external link.
-
RE: Site Architecture: Cross Linking vs. Siloing
I agree with Rand's '09 article in general, however there are some things I think take it a bit too far (such as redirecting PDF documents for link juice). If a PDF is truly the most relevant content on a topic, I believe it should be indexed.
The biggest factor is that if we get completely bogged down in this process just for SEO sake, we lose focus on user experience.
It's right up there with page and link sculpting - to me, it's a waste of time and harms user experience. And the time spent going that far is, in my opinion, much better spent in 2011 on other SEO tactics. Not just because Google has changed how they deal with nofollow links.
-
RE: About private questions
Cindy,
Though Moz moderators and staff monitor the Q&A system (and thus they may answer here), it's probably best if you post this question through the Moz help system to get the fastest attention from someone within the Moz team. I jump to the help system whenever I have an issue regarding the actual Moz service (as opposed to more general industry questions), since it yields me the best results...
-
RE: Press Releases and SEO in 2013
Fixed the link so you can read the complete article. The key points:
Press Releases should not be primarily for SEO - the link value is minimal at best except in rare situations.
They do offer, however, many other valuable benefits, even if they are "online only", because they are a perfectly accepted form of communication as part of a comprehensive marketing mix.
-
RE: Is 1 design better than another?
Joe,
I've done extensive SEO work and numerous audits in the real estate market. This is a situation where you're comparing apples to Honeycrisp apples. (Yeah, apples to oranges wasn't appropriate for a Minnesota site.)
Consider the sheer volume of search and the number of sites competing for the very generic "real estate" related phrases. Then do the same for the much more refined focus of "short sale" related phrases.
JoeAndCindy.com is competing in a much more difficult market online, and it's trying to target many more keyword variations.
-
RE: Moving wordpress to main website - errors galore
Ryan,
This is what happens here as you well know. We're not being paid for hours of review, evaluation and consideration. So sometimes we don't get that "let me sit on this for a couple days" opportunity to formulate a well crafted, re-arranged response...
-
RE: SEOmoz Dashboard Report: Crawl Diagnostic Summary
Gemma,
There could be several issues causing this. It's why I always cross-reference error counts with what Google Webmaster Tools reports. If both GWT and the Moz reports line up within a ballpark range, I'd definitely suggest looking into it; if they don't, and GWT consistently reports a much lower volume over time, you will have saved a lot of time, grief and energy. So start with that.
-
RE: Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
Ah to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice and with that, we are more likely to be safe from the wrath of Google.
SO - for this question let's consider the following:
A "Noscript" version of a site is designed first and foremost for people who have scripts turned off, including those who have browsers set up for either security reasons or for visual impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely in violation of any Google cloaking policy.
Cloaking comes into play when you generate content purely for Googlebot exclusively or Googlebot and Bingbot.
So if the content you are provided via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews, from Schema.org or at the very least RDFa reviews markup? If not, that would be problematic because only proper "reviews" specific structured markup should wrap around reviews content.
-
RE: How would you deal with eCommerce sorts?
I always recommend clients implement either noindex/follow on sort methods, or block sorting altogether (the first choice being preferred). If there are specific sort methods that consistently provide valuable conversions, you can consider setting those up as separate "evergreen" links on the site, but you would need to add unique content to each such page - enough to ensure it reduces (as much as possible) the duplicate content factor.
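As a minimal sketch, the noindex/follow option is just a robots meta tag in the head of each sorted URL - the sort parameter shown is a hypothetical example:
<!-- e.g. on /widgets?sort=price-asc -->
<meta name="robots" content="noindex, follow" />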
-
RE: Site Architecture: Cross Linking vs. Siloing
The "nearby hotels to consider" feature is a user thing. It may or may not pass quality page rank.
In some cases, that extra link could dilute the topical focus / strength of the page it's on.
So if I get to resort X's page, and there's a link to "nearby hotels", there's an implied relationship. Good for users. But for SEO, sure it's related stuff, yet maybe not laser focus related.
Another example is blog posts that end with a "related articles" box containing three or five links to other articles. Maybe they're highly related, maybe loosely. If they're loosely related, sure, it MIGHT be good to help users. Yet it probably dilutes this article's topical focus.
-
RE: 1 week has passed: Crawled pages still N/A
There used to be a separate SEOMoz help forum but it looks like that's not available anymore (at least at the moment). Go to the SEOmoz help page and click on the red "Contact Our Help Team" button and ask if they can look into it or whether it's just a time issue.
-
RE: Should you include keywords in your domain name to rank well on Google Places?
Adding to what Michael said, I will also say that keyword-infused domain names increase the potential for the site to be held more rigorously to the Exact Match Domain algorithm, which means your site will need to be even more trustworthy overall to avoid getting penalized for attempting to artificially rank for that or related phrases.
-
RE: Shall Google index a search result?
Georg,
What I communicate as "best practices" to clients is to noindex/follow all on-site search; however, you can look at your analytics to determine whether you think you're getting enough value out of the current system, or whether wiping that out will help with long term refinement of your content. The concept behind not having those indexed is that far fewer pages would each be given much more strength and weight long-term. Unfortunately there's no exact method to determine if this will be the case in your unique situation.
Given that your search results provide depth of content (not just a bunch of links, but actual, relevant text linking to detail pages), I'd be curious to see if any of your site, or specific phrases you were previously found for that led to those search pages, were hit by Panda.
-
RE: Site Architecture: Cross Linking vs. Siloing
The slides will be going up at some point in the next few days. And I'll have a follow-up post that includes the notes for each slide. In the meantime, I did an article on Search Marketing Wisdom yesterday directly related to the last slide in that deck.
-
RE: Do you pay much attention to Bing and Yahoo?
Yes - sign up for the Bing Webmaster Tools service - similar to Google Webmaster Tools. If there are issues or problems Bing encounters that's where you'll get reports on them.
And if you are lacking inbound links pointing to inner pages, you'll see gains in Google and Bing by getting some. Bing likes to see exact match anchor text links that come from other sites already ranking well in the Bing ecosystem. So do some research when finding inbound link sources - to see how they're doing in Bing.
-
RE: 20-30% of our ecommerce categories contain no extra content, could this be a problem
If a category page has almost no content (other than photos and product names), then that's a potential "thin content" issue, though the way your question is worded, I'm not confident my interpretation is actually what you meant by "no content beyond".
If product names don't reference the category name, and if there's a lack of any descriptive content on the category page, that's likely even more of a problem - thin content and lack of topical reinforcement of the category itself.
A general rule (barring other issues or considerations) is to have at least a couple paragraphs of unique, descriptive paragraph based text that reinforces the topical focus of each category page. There are numerous ways to split that content out across a category page, and in highly competitive categories, more content may be needed if not enough products exist in the category.
Other factors that can help mitigate this to a certain degree include (but aren't necessarily limited to):
- hierarchical URL structure (nested URLs, so product detail pages are seen at the URL level as being "beneath" their category - see the sketch just after this list)
- Proper nested breadcrumbs to reinforce that hierarchical structure
- Strong internal linking: a) within categories, this would include pagination code (rel-next/rel-prev); b) outside a category, this would include links and highly refined relevant content elsewhere on the site pointing to the category page.
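To illustrate the first two items above, here's a rough sketch of how nested URLs and breadcrumbs reinforce each other on a product detail page - the category and product names are placeholders, not anything from your actual catalog:
<!-- hypothetical product page at /power-tools/cordless-drills/acme-18v-drill/ -->
<nav class="breadcrumbs">
  <a href="/power-tools/">Power Tools</a> &gt;
  <a href="/power-tools/cordless-drills/">Cordless Drills</a> &gt;
  Acme 18V Drill
</nav>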
-
RE: Duplicate Content
If you won't noindex your category pages, my best suggestion is to add unique content to the home page - split out the content so it shows the X most recent article titles & snippets, and separately (to the side of it or above or below it) have unique home page only content - a good couple paragraphs of it should help.
-
RE: Site Architecture: Cross Linking vs. Siloing
just to clarify regarding my input - my perspective is based on my experience with client sites on all scales, small, medium, large and mega sites.
To me it's more important to see how things work on our own sites and evolve them over time, rather than purely treating what others do or say as its own reason for taking action.
-
RE: Tool recommendation for Page Depth?
Screaming Frog is my go-to crawler. One of the many data points it provides is page level. Level 0 is the home page, level 1 is one click from it, and so on...
It's an invaluable tool for anyone wanting to check on a vast range of crawl efficiency and SEO factors.
-
RE: Keywords and the role of 'in', 'for', 'to' and plurals
At least part of the answer lies in your competitive landscape. If you properly optimize just a couple variations as primary phrases, add in other variations as secondary phrases to include within the content, get inbound links spread across both the primary and secondary, you should show up for several if not all of the variations you've listed.
Where the competitive landscape factor comes in is this. If you choose options 1 & 2 as your primary, then 3, 4 and 5 as your secondary, and any competitor targets 3 & 4 as their primary and does full on optimization, they'll likely rank higher for those. But that's only if all other factors are equal.
The best way to deal with this and not end up in a vicious cycle, is to leverage PPC ads on the full spread while you focus SEO as I've suggested.
If the organic competition is weak, you won't have as much of a battle.
Oh - I forgot to mention that I know this because I've worked on U.S. based rental sites - one of the largest regional rental sites, and one of the top two national sites.
-
RE: Duplicate product urls
And to add to Matt's reply, not all search engines recognize canonical tags, so if it is a dupe content issue, then 301s are vital.
-
RE: Site Architecture: Cross Linking vs. Siloing
If I have a category California Hotels and a sub-category San Francisco Hotels, and there are links in a sub-navigation bar to each (if there's only a handful), each of those links reinforces the strength of the top level Hotels, 2nd level California, and third level San Francisco related phrases. They all support each other.
If, on the other hand, I have a link to "nearby hotels", that implies I'm going from a single hotel details page to a uniquely filtered "geo" category page that shows hotels based on some criteria - it might be all San Francisco, or all within a distance radius, or all within a zip code radius.
Even if it's all other hotels in San Francisco, it's not a link pointing to another (or several) same-level page(s). It's pointing one layer higher.
That's a filter more than a properly constructed category drill-down. And it implies that the page I'm on will NOT be listed on that target of the "nearby" link.
-
RE: Competitor outranking us despite all SEO metrics in our favour
Lou,
"I just wanted to throw a few factors out there in order to encourage a response like yours - packed full of useful next steps for me to evalaute this further."
THAT is priceless
Pagination:
Loading all content on one page and using a "more" button to "reveal" it is not a best practice. Individual pages need to exist for individual sub-topic based content. This is especially true since it now appears that Google, while indexing content initially hidden to users, is likely giving less value to that hidden content than to content immediately seen.
Pagination is important IF it is executed properly. If you have tens of thousands of results in paginated lists, is that one paginated group, or are they split out into separate groups based on similarity of content? If it's all just one massive group, that's likely another problem to look into, since pagination is meant to be used to say "these pages all contain links to other content where the entire group comprises very similar content around one primary topic".
Internal linking should always point more to main category page destinations than individual pieces of content. It would be unnatural from a usability perspective to link more to individual pieces of content, and thus it would be bad for SEO.
5,000 or so average crawl errors - what is causing those? Are they 404s? Were they previously valid pages? If so, those typically should not return a 404 but instead 301 directly to a highly relevant live page (with internal links within the site updated accordingly).
So many more issues to consider...
-
RE: How do you limit the number of keywords that will be researched
hahaha wLoudogg - don't even suggest that - I've been offered such nonsense plenty of times. Never makes it any more worth the insanity
-
RE: See screenshot: Is this an example of Canonical issue or am I making an error in judgement?
I wouldn't even pause to think. I'd just immediately do the 301 redirect. The fact that both versions are reporting as many links as they do is an indicator of serious link dilution.
-
RE: Site Architecture: Cross Linking vs. Siloing
Having all of them listed and linked is ideal for SEO, however you rapidly cross into usability problems if there are more than a handful. (Would you want 50 or a hundred links in a sidebar nav?) When a site is so big that there are more than a handful that could be linked from that sidebar, it's actually best practice to NOT have any others linked from the sidebar, else you confuse users even more (listing only some, but not all). User Experience is paramount when making these decisions. Even at the expense of SEO in some cases. And if that happens, other tactics need to be employed. Like having a separate, dedicated funnel for "featured properties". Which requires even more unique content in that funnel. But it at least boosts the ranking value for those properties included.
-
RE: Is Creating a Lot of Content A Bad SEO Strategy?
I think it all comes down to the quality of the content, the ease of readability, and the ability to not diverge too far from the primary topic.
I personally find pages that are endless content, such as blog indexes that load entire articles on the page, to be quite annoying. In that scenario, helping SEO is outweighed by harming user experience. And if the sub-topics wander too far, it can dilute the primary topical focus of the page.
So with proper planning, and user experience considerations, sure, it can be done. Heck, there's a common belief that short blog articles are better than long ones. Yet some of my best, most read, most linked-to blog articles have gone on seemingly forever.
On the flip side, my "Anatomy of an SEO Audit" articles were each strong enough on their own that it was best to split the content out into four pieces. Not only did it make readability a bit more reasonable, it gave me three additional "new content" opportunities, and I got to link across all of them by the 4th article. Two more valid SEO factors to consider.
-
RE: Google Search Volume Disparities
mreisbeck
In the Google Keyword Tool screen, below the "categories" choices, there's an option box on the left sidebar to choose broad, exact, phrase or a combination of those.
That being said, every situation is unique - so what Google reports as a low volume exact match may be highly valuable if the majority of people searching for that phrase do so in broad phrases or "long tail" phrases. So don't be so quick to completely discount a phrase just from that data. What do your visitor conversion statistics tell you? Call to action and conversion tracking data is vital in helping make the best decisions in this kind of situation.
-
RE: How long should anchor text be? Best practice for anchor text length?
In reality, there is no specific "right" length for anchor text. Search engines fully expect everything from single-word to several-word anchors, and a mix of keyword-specific, brand, and generic random text.
In an ideal world, it should come down to "what makes the most sense here from a user perspective". In the case of the example you point out, it looks odd, but not from a spammy perspective - you don't get SEO value (perceived or otherwise) from including "click here to view" in the link.
Instead, it is less than ideal from a user experience perspective. It's actually a failed marketing attempt at motivating people to click on the links.
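As a purely hypothetical illustration of that user experience point (the URL and report name are made up), compare:
<a href="/reports/widget-market/">click here to view our widget market report</a>
versus simply describing the destination:
Read our <a href="/reports/widget-market/">widget market report</a>.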
-
RE: Site Architecture: Cross Linking vs. Siloing
Etsy's got a good structure with their category and sub-category sidebar that balances SEO and user experience. Note, though, that when you get deep into the individual Etsy stores, that sidebar is gone, because it would dilute the individual store owner's account focus and distract users.
-
RE: Best way to avoid duplicate content issues here.
Actually Google does read the text in some images. It's far from perfect, however it does happen. The key to success, whatever way you go about it, is to provide more unique commentary for each section you quote than the quoted content itself. It's important both as an SEO issue as well as a protection within "fair use" laws (at least here in the U.S., anyhow).
Also if you're doing it in a blog, use the "blockquote" feature if you're not using images. And either way, be sure to link to the original source toward the opening of the article.
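A minimal sketch of that pattern (the source URL and article name are placeholders):
<p>As the <a href="http://www.example.com/original-article/">original article at Example.com</a> puts it:</p>
<blockquote cite="http://www.example.com/original-article/">
  The quoted passage goes here.
</blockquote>
<p>Your own commentary on that passage - ideally longer than the quote itself - goes here.</p>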
-
RE: Should we change our site domain name to include our keyword?
I'd like to offer recommendations for where the existing site might simply have an SEO problem that won't go away just by getting the new domain's value; however, when I went to rocketproblems.com just now, I got a "server not found" message.
So without seeing your site, I'll offer insight on what to consider doing if you do the switch.
Exact match or partial match domains do offer value. They are, however, on Google's radar for possibly getting less value than they do now, though Matt Cutts' exact language in a video earlier this year was that they're considering a minor/slight reduction in value.
I wouldn't go by the competitors' visible actions. Just one month is not long enough to determine the value of such a transfer. It could easily take a couple of months or longer to rebound and build the proper new trust factors that the new domain name will require, including links to the new domain.
Links to the existing site should be examined to find out which ones you can request being changed to the new domain name as well, since 301 redirects (vital to ensure these are implemented across the board and tested for verification) do take a hit on passing value.
I would also recommend a full press release through PRWeb.com or PRNewswire.com upon change to the new domain, as well as a social media campaign to promote the new domain.
So if you are prepared to wait out the potential drop over an extended period of time, do a full-court press.