Macy's, Neiman Marcus, Bergdorf Goodman, Fortunoff, Gap... All have their Brand in page Titles.
Interestingly, Gap uses the combination of Category | Gap | Sales Hook (a hybrid of my suggestion and EGOL's) on many of their pages.
You can also add a canonical tag on the old page, pointing to the new preferred page URL. This can help speed up the Google indexing change; some testing suggests it works faster than waiting for Google to process the 301 redirect alone.
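A minimal sketch of what that looks like in the old page's <head> section - the href here is a placeholder for your new preferred URL:

    <link rel="canonical" href="http://www.example.com/new-preferred-page/" />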
Also, if any links point to that page, whether within your own site or from third-party sites, it's a best practice to update them. The ones on your site you can take care of yourself; the off-site inbound links can be more challenging, requiring outreach to other site owners, who are not always responsive. Yet if you can do this, it helps pass more authority to the new page sooner.
I always recommend to clients that unless they're a world-renowned brand, they include the brand name as the first part of the page Title on core informational pages (about, contact, jobs, etc.). On the rest of the site it's optional, and if it is included on other pages, it should definitely go at the end of the Title string, after each page's primary topical focus.
As for products and categories, unless you've got a site that's dominating the search results, I always recommend Product Name | Product Category | Optional BrandName
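For example, a hypothetical product page title following that pattern (the product, category, and brand names here are made up):

    <title>Men's Leather Wallet | Wallets & Accessories | ExampleBrand</title>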
This is vital because you need to build topical relevance for every product - both specific to that product and to how that page relates to its larger category. Imagine having 30 product pages in one category: that's 31 pages that emphasize the Category, yet in the proper syntactical order for individual page relevance.
Then, as your site becomes truly strong in search results, you can go with EGOL's method.
Always remember that Google displays at most the first 70 characters of each Title - so look at how your Titles would appear in Google. If you do include the brand at the tail, it's okay if it gets cut off in the Google display - Google still processes it, and users will see it in their browser when on your site, as an additional brand-strengthening aspect of your site's design.
What I've found is that the reason for this comes down to how the Google system works. Case in point: a client site of mine with 25,000 actual pages has mass duplicate content issues. When I do a generic site: query with the domain, Google shows 50-60,000 pages. If I do an inurl: query with a specific URL parameter, I get either 500,000 or over a million.
Though that's not your exact situation, it can help explain what's happening.
Essentially, if you do a normal site: query, Google will try its best to show the content within the site that it presents to the world based on the "most relevant" content. When you do a refined check, it's naturally going to look for the content that really is the closest match to that actual parameter.
So if you're seeing more results with the refined process, it means that on any given day, at any given time, when someone does a general search, the Google system will filter out a lot of content that isn't seen as highly valuable for that particular search. Many of the extra pages that come up in your refined check are most likely evaluated as less than highly valuable, high quality, or relevant to most searches.
Even if many are great pages, their system has multiple algorithms that have to be run to assign value. What you are seeing is those processes struggling to sort it all out.
The claim is that they're pretty good at detecting headers, sidebars and footers - and make some accommodation for these. In my experience it's mostly related to links in those areas, though there's some duplicate content consideration.
Having said that, I've also consistently seen that unless you've got a mega site with a lot of other SEO going on, that accommodation is not enough to compensate for having almost no actual unique content on important pages. You still need to consider the overall competitive landscape, and you still need more than a spit's worth of content. And the less you do in regard to other SEO factors, the more content you absolutely need.
Yes, there is.
It's not based on old-school SEO value though as much as newer reasoning.
Press releases definitely have much more churn than ever - so many are sent out by so many companies every day, and many of them are junk / very low quality. So any link value you get is going to be at a much lower count than releases used to generate, and many of those links will stick around a lot less - the shelf life is much shorter.
At the same time, however, if you opt for the wider distribution options (such as PRWeb's top two options), you're more likely to get at least some decent sites in the mix, especially if you go with the top-tier release option. This is much more valid if the releases you create contain only one or two links in the content. And don't opt to have your web site embedded in their iFrame - it's a terrible usability issue, offers NO value, and most definitely will not be picked up by high-end news feeds.
The other SEO value comes from the fact that those quality sites that do pick them up and post them do provide not just link value but mentions related to branding.
Also be sure to take the time to properly categorize the release in terms of the right industry and regional reach.
On a final note, they're most effective when you send them out on a regular basis - not too often, yet no less than three or four times a year.
And only when you have something that's truly worthy of a legitimate press release.
artversion1,
With all due respect, in my experience Magento is one of the worst possible eCommerce solutions from an SEO perspective - one of the most difficult CMSs ever created in that regard. Yes, it can be used. However, the handful of times I've seen it in place, I've needed to spend much more time working with developers to "get it right", especially in taking it to a higher level of SEO, than with Joomla, Drupal, or WordPress...
If it's an open source solution built on a common framework (PHP / Zend comes to mind as just one example) then the cost of the customization could very well be less than trying to adapt an existing plug-in to a new, custom function it wasn't intended to accommodate.
It may, yet it may not, be less expensive to go with the off-the-shelf system. I've seen plenty of sites that ended up costing more due to off-the-shelf system limitations and reliance on community-created plug-ins to do what they were not intended to do.
Ultimately, I was just providing the alternate considerations that I've personally seen and had to deal with, both in SEO and, before that, as a project manager responsible for some of the most powerful sites on earth.
If you truly have a budget for this work, the answer may be "none of the above". I can't tell you how many times I've seen sites built in WordPress, Joomla, or Drupal that required high-quality user functionality, and in turn needed the "out of the box" CMS to be customized extensively. Then, six months or a year later, other changes had to be customized, because that's the evolution of the web. (I've got sixteen years in the business.)
If you have the right web developer, one who can create exactly what you need with an open-source solution customized exactly the way you need it, you don't have to rely on finding a plug-in that doesn't really do what you want but that you're stuck with because the site was built in WordPress, Joomla, or Drupal. Customizing in that situation costs a lot more than it would on a from-scratch site with that functionality already built in.
So the real process, then, becomes:
1. Write a comprehensive and detailed document that explains exactly what you need the site to do, under all the possible scenarios that apply to that unique site in that unique market.
Consider that your site might need to serve multiple markets (for example some visitors might be retail, and some might be wholesale). Get that user experience information into the specification document.
2. Provide that document to three different developers and find out what they would charge for the solution, and if they guarantee in writing that what the document specifies will in fact be included.
3. Make sure you really detail things out. Don't just say "It has to work with SEO". It should be a site that "accommodates current SEO best practices functionality".
Don't worry about the platform of choice. Worry about getting a site that really meets your real needs.
I think it all comes down to the quality of the content, the ease of readability, and the ability to not diverge too far from the primary topic.
I personally find pages that are endless content, such as blog indexes that load entire articles on the page, to be quite annoying. In that scenario, helping SEO is outweighed by harming user experience. And if the sub-topics wander too far, it can dilute the primary topical focus of the page.
So with proper planning, and user experience considerations, sure, it can be done. Heck, there's a common belief that short blog articles are better than long ones. Yet some of my best, most read, most linked-to blog articles have gone on seemingly forever.
On the flip side, each part of my "Anatomy of an SEO Audit" series was strong enough on its own that it was best to split it into four pieces. Not only did that make readability a bit more reasonable, it gave me three additional "new content" opportunities, and I got to link across all of them by the 4th article - two more valid SEO factors to consider.
I'd offer a slightly different perspective.
If you create lots of content that supports a higher level page, many of those supporting pages might very well not ever garner any external links. Yet they very well could offer tremendous value in boosting the primary page's value from an internal linking perspective as well as an inbound link perspective.
For example - if you want to be known as THE AUTHORITY on all things related to widgets, you'd be wise to have many second and even third tier content pages within your site structure. The vast majority, if not all of the inbound links would point to the higher level pages exactly BECAUSE you've got all that depth.
Welcome to the world of competitive tactics based on real or perceived return on investment, where ethics is a matter of personal belief. Some people believe ethics matter; others claim they're irrelevant, because obviously they will do whatever they can, in an attempt to make as much money by any means necessary as they think they can get away with.
It doesn't hold the same value as if every page on oldsite.com were redirected to the proper equivalent page on newsite.com. The long-term value would be in the links pointing to those pages redirecting to the new ones, combined with the links pointing to the old site's home page redirecting to the new site's home page.
How much lasting value it gets depends on how many links pointed to oldsite.com's home page. Yet ultimately, if no new signals are generated and newsite.com doesn't build out in depth or spread of keywords, it's going to fall off over time.
Exactly how much time is anybody's guess.
The vast majority of people who do such things do it for the short (whatever that means) term boost, and if there are ads on newsite.com, the clicks on those ads.
Good question as to whether the search engines know about this and if so, how they deal with it. Ultimately they don't say exactly.
In the past couple years I have not heard of anyone doing such things as their primary path to long-term success.
The actual length of time it lasts would definitely depend on the market it's in. Are many other sites actively doing across-the-board SEO - both on-site and off-site? If so, it'd last less time than if the market was relatively quiet.
How long it would take you to bypass newsite.com is also an unknown factor, regardless of the method you choose. SEO best practices call for putting in the footwork and trusting that over time you'll see value. And you may or may not overcome newsite.com, depending on what they do and the rest of the market.
Ryan,
You added some great additional insight here for Bill to consider. Excellent work on that.
And yes, I agree with you in not being happy that the "edit" link doesn't want to work lately here.
LOL careful Steve - it took me just a couple months of answering questions here to skyrocket from a couple hundred points to "authority" level. Of course, I did have to provide a lot of answers that were given "good answer" status. But if that's all it took to get a sig link here...
Then again - if people really did stick around long enough (even a couple months) and really gave good enough answers to rise that way, I suppose everybody would win. Even if most bailed out after they got their link.
Then again - if they did it for the link, they could then possibly revert to crappy answers, knowing that the sig would be there even with crappy answers.
Yeah - before my head explodes, I'll just stick with the "hope they don't offer it" answer you guys gave.
There is also some discussion within the industry that links of that nature have been so seriously abused by spammers that Google now recognizes such link patterns and either devalues them (doesn't give them the link value to pass along) or may even ding the site where all those signature files are.
It's easy to get confused with terminology. All pages, however, should have high quality, unique, paragraph based content, no matter what you call them.
You have the right idea for organization.
From the home page, there should be links to the top level categories
White Tea
Black Tea
Oolong Tea
British Tea
Then all of your articles having anything to do with White Tea would be linked from within the White Tea section of the site.
So the tree would then look like:

Home
    White Tea
        (all White Tea articles)
    Black Tea
        (all Black Tea articles)
    Oolong Tea
        (all Oolong Tea articles)
    British Tea
        (all British Tea articles)
This is, in fact, high quality content organization. So congratulations for having understood the concept.
If it's accessible to members only (and not open to search indexing), I'd even copy some of my past blog articles into it. I've referred to a number of them in the past couple months.
I like this concept a lot. Commonly asked questions can readily be assigned to specific drill-down categories and sub-categories. Not necessarily based on the topics people assign their own questions though. I've seen a lot of questions here assigned to many categories, not all truly accurate (another symptom of people not really having the expertise, though if they did, they wouldn't be asking, I suppose).
Money? Did someone say money?
Actually, EGOL, something you just responded with caught my eye - "Will the questions be interesting and varied enough to hold the interest of experienced people?"
That is a very good question. Even when forums create "sticky" entries, and FAQs, and such, so many people never read them. Which inevitably leads to so many basic questions being asked over and over that it tends to burn many more experienced people out.
Yet perhaps over time enough experienced people will stay, or at least be joined by other experienced people on a regular basis.
All I do know for sure, is that the Moz Pro Q&A is one of the best, if not the best system I've seen to date.
In addition to Steve having pointed out that the PR you see is totally invalid, and not truly reflective of your real situation, I'd also say that if it's only been back online a couple days, it's way too early to even begin to determine how the site holds up. Nowadays Google assigns an initial estimated ranking value to each page on the site, and the site overall, but then other algorithms are run over the course of days, if not weeks. These provide additional input to either confirm, deny or otherwise cause a modification to that initial assessment.
Then there's the reality that in all that time, if other sites that compete for the same phrases undergo changes themselves in that general time-frame, that too would cause shifting.
Personally I don't put too much weight in ANY "ranking" data. Whether it's MozRank, Keyword search results ranking, or any other kind. All of it's just a general guide.
The only thing that truly matters nowadays for the vast majority of situations is - am I getting a volume of organic search traffic that suits my needs, goals? And from there, is that traffic highly relevant, to the point where those visitors become conversions - meaning how many of them take that next step after arriving that I am seeking? Such as filling out specific forms, or making specific purchases, or whatever the metric is.
That data, over time, is what matters way more than ranking numbers.
Just my experience.
Perhaps the better question might be "Is this phenomenon due to the fact that there are new people entering the industry all the time, or is it just that not enough clear, readily available information exists?"
Because I think this is the situation: new people are still coming into the industry in droves. This is most likely because more and more old-school marketing companies finally realize they need to adapt, more business owners realize they need SEO yet can't necessarily afford to hire a seasoned professional or agency, and, as SEO gets more attention in the media, more people try to join the industry themselves.
Then there's the reality that with the economy not improving dramatically in such a long time, more people look for new ways to generate income as entrepreneurs than previously.
So, put another way, to answer your question more directly: I'd say no, most likely not.
The key issue you wrote about, titles that are too long, relates to how well a page matches the topical focus within the title.
Too many words in a title can cause topical dilution if they're all keyword phrases.
In my experience, you should have the most important / most relevant content at the front of the title relative to the page topic, so in that regard you're doing it properly. Generally speaking, I teach clients to keep each individual page title to 70 characters - not to prevent dilution specifically, but to ensure the entire title shows up in the Google search results. (Note that Google sometimes overrides your given title with one of their own if they think yours doesn't truly match the page focus.)
When a title goes beyond 70 characters, if the extra text is brand focused, it's not a terrible thing. Google will still process the entire title, it's just the whole thing won't show up in the results pages.
Having all the titles appended after the unique forum topic with your forum brand is not 100% ideal in regard to matching the individual page topic, however it's perfectly acceptable from an overall branding perspective.
As for MagentoWebDeveloper and his concern with repetition, there is truth to that, to a certain degree, however it's not as major an impact because you do have each title prepended with the individual page's topical focus.
And the more you do to focus on across-the-board SEO, the less concern that becomes.
1. Detect the crawler type. If the mobile googlebot comes to the regular site, redirect it to the mobile site; if the regular googlebot comes to the mobile site, redirect it to the regular site. You should also do this as a best practice based on all visitor browser types. It's not cloaking; it's serving up the proper version of the site to the source it belongs with (see the sketch after these steps). You can see Matt Cutts' help video on this subject here.
2. Create a mobile sitemap file and submit it to Google and Bing.
If you do this properly, you should not have any problems with duplicate content because they're sophisticated enough to understand that you're not trying to serve up two sets of content to the same visitor type.
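If it helps, here's a minimal sketch of the redirect in step 1 as an Apache .htaccess rule, assuming it lives on the regular site only - m.example.com is a placeholder for your mobile domain, and the user-agent list is illustrative, not a complete detection solution:

    RewriteEngine On
    # Send the mobile crawler and common mobile browsers to the mobile version
    RewriteCond %{HTTP_USER_AGENT} (Googlebot-Mobile|iPhone|Android) [NC]
    RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]

A 302 is used here rather than a 301 because the redirect depends on who's visiting, not on the content having permanently moved.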
Just to clarify and provide a more accurate answer to your question:
1. Whichever version you want indexed (www or non-www), you should have site-wide 301 Redirect functionality set up pointing from the one you don't want to the one you do (see the sketch after this list).
2. In GWT, you can designate that you want them to prefer the www version or the non www version.
3. If you don't already, be sure to have a sitemap.xml file set up on your site, and submit it through Google Webmaster Tools and Bing's Webmaster Tools.
4. Having inbound links helps for ranking however it's not required just for indexing, if you submit the sitemap.xml file. I've had plenty of sites indexed way before any inbound links existed.
5. With both Google and Bing's webmaster tools you can review their data to see if either find any problems in your site.
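For point 1, here's a minimal sketch assuming Apache and the www version as the preferred one (example.com is a placeholder):

    RewriteEngine On
    # 301 any non-www request to the www version, preserving the path
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]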
Duplicate content is one of many hundreds of factors. If you have a very well crafted site, highly optimized, and with a very strong inbound link profile, but only a couple pages (ones that are not highly relevant to your primary topical focus) are duplicate, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but it's not a flaw that applies to the whole site, that single factor is going to have minimal impact on the overall site.
Giving help to others is very rewarding, and not just through points (although I do enjoy watching mine), so enjoy the experience, João!
Just to be sure, I went and gave a thumbs up to your question, then refreshed the page. Your score went up a point. Hang in there, keep participating and it should go up over time.
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking completely in the robots.txt file).
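That tag goes in the <head> of each page you want kept out of the index while still letting its links pass value:

    <meta name="robots" content="noindex,follow" />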
If you're on WP, then I definitely recommend the Google XML Sitemaps plug-in. It does all the heavy lifting for you.
Regarding the source ownership factor, I'd also recommend using the Google Source Attribution tag on all your articles.
Your sitemap.xml file has three entries. I recommend automating the URL generation so it includes new content as it's posted, with lastmod set to the date/time of posting. I'd then ping that file out to Google at least once a day if you've got new content every day, or even each time a new article is published.
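For reference, a single automated entry in the sitemap would look something like this (the URL and date are placeholders):

    <url>
      <loc>http://www.example.com/blog/new-article/</loc>
      <lastmod>2011-06-15T09:30:00+00:00</lastmod>
    </url>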
I'd also put a date/time stamp tag in the header using the HTML standard meta tag format for dates
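As a sketch - and I'm assuming the simple name="date" convention here; the Dublin Core form (DC.date.created) works similarly:

    <meta name="date" content="2011-06-15" />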
That isn't guaranteed to help, however it has helped some sites I've worked on in the past.
That's a very good question. You may want to post it as a new question so we can get fresh eyes on it. Personally, I haven't ever done so many sites at once, so I too am curious to find out whether others have this specific experience.
If not, you may very well end up being the guinea pig. In that case, I would suggest not doing more than one or two a week.
If I had the time, I'd like to learn more specifics. Unfortunately, I struggle to eke out enough time just to participate here in the Q&A system as it is...
Okay based on that pattern, I've seen similar volatility with some phrases but not others for a client in a highly competitive market, with the volatility lasting years. From what I have been able to ascertain, in this case it was due to the fierce nature of competitive activity for those phrases - literally where X number of competitors have been fighting it out. In that time-span we've bounced back to the top a number of times, only to have those phrases drop again, then rebound.
It depends on how long it will be before the new content is up - if it's a couple of days, you might as well leave them. If it's weeks or longer, it's best to change them, though honestly, making the change one way or another at this point is not necessarily going to move the site in any significant way - it's difficult to judge how much, if at all.
This is much better from the previous over-saturation perspective. Here's my next question though - all those links in the lower part that still remain - they all use "business class CityName" as anchor text.
Except none of them points to a business class specific page. So how do you confirm to Google that "these pages really are about business class services", other than by using that anchor text from that one page?
And to add to Matt's reply, not all search engines recognize canonical tags, so if it is a dupe content issue, then 301s are vital.
When I went to the longer version URL just now, it took me to your "404 not found" page.
If the URL you want to use is the short version, then the long version should have a 301 redirect implemented pointing to the short version.
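On Apache, a single-URL redirect of that kind can be a one-line directive in the .htaccess file (both URLs here are placeholders for your long and short versions):

    Redirect 301 /the/long/version/of/the/url http://www.example.com/short-url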
And the beauty is you have the opportunity to help change that.
Yes. It's only a secondary-level aid, and not guaranteed, yet it could help speed up the process of devaluing those pages in Google's internal system. If the system sees those and cross-references them against the robots.txt file, it could help.
There's no bulk page removal request form, so you'd need to submit every URL one at a time, and even then it's not a guaranteed method. You could consider getting a canonical tag on those specific pages that points to a different URL on your blog, such as an appropriate category page or the blog home page. That could help speed things up, but canonical tags themselves are only "hints" to Google.
Ultimately it's a time and patience thing.
Dave,
It's possible that it was a Panda-related hit. From the moment Google implemented Panda and got a flood of feedback, they began making minor changes to it, all the way up to and through the global April 11th roll-out.
It could have been many other factors as well. If the site had a serious crawl problem in the days leading up to the drop, that could have triggered the drop. Or that could have triggered falling into Panda.
The end result however, is that it sounds like you're aware of a number of SEO best practices related problems, and whatever the cause, these should be addressed. Any one or more of them could have been a trigger point or added to it.
I always advocate to clients that regardless of the cause, it's important to implement SEO best practices both to help restore drop-off and to prevent future drop-off based on whatever next change search engines make.
Without enough links to the target page (a mix of keyword variations and generic, non-keyword anchor text), the target page doesn't have enough off-site "trust" when compared head to head with other sites.
Yet if all you had were links to internal pages, that too would look unnatural, so always have some of your new links point to the home page.
Ramon
There could be several other factors, however here are some quick-hit findings:
1. Inbound Link anchor text spread
While you've got much more content and many more links, wholesale-flights.com has a much broader variety of inbound link anchor text related to "business class" than you do, including pointing to their home page. (OpenSiteExplorer.org is great for providing insights like this).
2. Internal keyword saturation
You've got an extremely over-saturated usage of keywords in your internal linking (all those links in the bottom third of your home page, for example).
By not having enough inbound link quality relative to that specific phrase, while simultaneously overusing it in internal links, you're out of balance.
3. Core architecture flaw
When I go to your Business Class page, the entire top navigation is gone. As though you've made that specific page into more of a PPC landing page than a primary site page.
On that Business Class page, you've got section-level navigation specific to Business Class, with the top link labeled "home". Again, this communicates that this really isn't a core part of the main site, but some sort of improperly isolated section. Then on one of those sub-pages, not only are the top navigation links gone, so are the primary site footer links.
Yes, it's good, and important to maintain that section level navigation, yet it's just as important to communicate that it's a core section of the main site.
4. Section level content.
Other than the initial business class page, none of the other pages in that section have any substantial content to speak of - they're all very weak overall.
People in power like to throw that power around, both to prove to others that they have it, and to remind themselves that they're so amazing.
Since you're the new guy, and have yet to prove your worth, my best suggestion, both from an SEO perspective as well as a life-lesson and general business perspective would be to let the subject drop for the time being. If it's just content on the home page and not the entire site, given the politics of it all, you're probably better off focusing on other tasks so that you can build your worth to management over time.
Maybe one day you'll be able to revisit the concept of why duplicate content is bad for the company, maybe you won't.
It is SEO best practice to be consistent with case usage. However, upper case and lower case are interchangeable on some server systems, while they are treated as different/unique on others. If the new system lets you use upper and lower case interchangeably, you should not see anything but 200s, and if that's true, then you should be okay.
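A quick way to verify is to request both casings of the same URL and compare the status codes (the domain and path are placeholders):

    curl -I http://www.example.com/Some-Page
    curl -I http://www.example.com/some-page

If both come back 200, the server is treating case interchangeably. If one returns a 404, case matters on that system, and you'd want 301 redirects (or consistent internal linking) in place.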