Posts made by AlanBleiweiss
-
RE: Could a sitewide footer EXACT MATCH anchor text link hurt or potentially penalize a site?
I agree Geoff - home should be home, not "keyword phrase", but you can use keywords on other footer nav links if it makes sense from a user perspective, and again, purely within the site.
-
RE: Www or not www base url
"It doesn't sound like they are a developer at all if" no kidding! I'm not a developer, yet as a project manager even I knew it would be that straight forward...
-
RE: Trackback/Syndication
I recommend implementing Google's Rel-Publisher markup to designate the original publisher. There's at least one WordPress plugin for this, however I can't speak to whether it works or not - only that the Google markup is currently best practice...
Combined with Rel-Author, it's how Google determines content ownership
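For reference, a minimal sketch of both tags as they would sit in a page's head (the Google+ page/profile URLs are placeholders):

    <!-- Hypothetical sketch - the Google+ URLs are placeholders -->
    <link rel="publisher" href="https://plus.google.com/[your-publisher-page-id]/">
    <link rel="author" href="https://plus.google.com/[your-author-profile-id]/">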
-
RE: Could a sitewide footer EXACT MATCH anchor text link hurt or potentially penalize a site?
Yes!
As long ago as late 2010, I've seen specific sites penalized for abuse of exact match anchor text in footers. For a long time it was strictly a "keyword specific" penalty (de-ranking just for that phrase); with the latest indications, however, it's possibly also a contributing factor to the Penguin problem (or to another "anchor text specific" update Google made in April)...
Either way, sitewide footer links need to be clean if used at all. So I don't recommend anchor text to a home page. It should instead just be "Home" or something similar...
-
RE: How do I eliminate indexed products?
The query site:trophycentral.com -www shows all of the indexed content that is not within the www subdomain.
-
RE: How do I eliminate indexed products?
This is a sad reality that many business owners face. SEO is a very complex process, and unfortunately there's a two-fold barrier to success.
On the one hand, not all SEOs really understand the long-term ramifications of what they recommend - or even when they do, they don't consider them. Many of us in the industry think and act otherwise; it is a problem nonetheless.
On the other hand, Google constantly changes their rules to a certain extent - as more people look for ways to game the system, what may have been acceptable previously can become unacceptable as Google tries to clean up the mess. It's a vicious circle.
So...
You can get "rid" of indexed pages by blocking them through a robots.txt file - if there are patterns to their URLs. If it's an entire site, you can block the whole site with one line in the robots.txt file. If it's multiple sections of a site, you can block entire sections while leaving other sections open for search indexing. A professional should be tapped to help you with that.
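To illustrate, a minimal robots.txt sketch (the /search/ and /old-products/ paths are hypothetical):

    # Block patterned sections, leave the rest open for indexing
    User-agent: *
    Disallow: /search/
    Disallow: /old-products/

    # To block an entire site instead, the single line would be:
    # Disallow: /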
It's important to consider whether pages should be blocked, or instead redirected to similar pages that you want indexed (or that already are).
Bad link evaluation is a professional process and should not be undertaken lightly. In many cases, the owners of the linking sites will ignore your removal requests, so it's important to at least get the request process right and to document it. Again, a professional is needed for that.
And yes, you can submit a request after that's done.
Unfortunately, anyone you task to do the work that would be able to help you will both charge you for their time and cannot guarantee that what is done will be enough. It's another reality of the world we operate in because Google cannot reveal trade secrets to help you know what exactly needs to be done.
-
RE: How long does it take for an article or a page to be listed by google
I'm not a Joomla expert - so your best bet is to check with someone who is. However, there are Joomla extensions you can use to automate the generation of your sitemap so you don't have to do it manually every time.
Which one you use is something I'm not prepared to recommend because I am not up to speed enough on Joomla.
-
RE: IP ranges and matching WHOIS
If Site A is a cash cow that does not need SEO, then I would block the entire site from search engines via the robots.txt file. Even on separate hosts, the sheer volume of links pointing to Site B is a big negative, given that Site A likely carries a "bad rap" label for its SEO.
Duplicate content does not need a "same server" relationship to be a big problem either. All duplicate content is a problem regardless of location.
If a client I represent is doing things that I believe are impeding their success, I personally believe it's important to communicate my concern. However, if they choose to ignore that communication, that's their right to do so.
-
RE: How do I eliminate indexed products?
I checked a few things - you have bigger fish to fry.
search.trophycentral.com and mobile.trophycentral.com - are either of those causing duplicate content problems on a mass scale? Google has 8k www pages indexed, and three times as many non-www pages indexed.
Then looking at your inbound link profile, you've got a huge volume of inbound links from very low quality "SEO only" sites relative to your total inbound link profile. Very serious problem there.
Then you've got another site you apparently own (StrictlyGifts.com), where a keyword footer link on every page of that site points to the TrophyCentral site.
The list goes on.
The cumulative impact is clearly a case of your site showing all the signs of SEO stuffing and unnatural link building. Even if some of it was unintentional (like the duplicate content conflicts across sub-domains), the pattern is clear.
1. Do all you can to get as many of the bad links removed as possible. Keep a spreadsheet of that effort.
2. Block search.trophycentral.com from search engines.
3. Either redirect mobile.trophycentral.com to the main www site and have the main www site redesigned to be flexible and responsive to different viewing platforms, or get together with a mobile SEO expert to otherwise address that duplicate content issue. (See the redirect sketch after this list.)
4. Add high quality unique descriptive text based content to all of your primary category and sub-category pages, and even some on every product page.
5. Consider submitting a reconsideration request to Google after you've made real headway on all of that work. Be willing to submit the spreadsheet to show links you've had removed and which you tried to but couldn't. Explain all the other steps you've taken.
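For step 3, a minimal Apache sketch of that subdomain redirect (assuming Apache; adapt to whatever your server actually runs):

    # Hypothetical mod_rewrite sketch - 301 every mobile URL to its www equivalent
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^mobile\.trophycentral\.com$ [NC]
    RewriteRule ^(.*)$ http://www.trophycentral.com/$1 [R=301,L]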
You may get a bounce after that, or recovery could just take until the next Penguin update, if you've done most of that work before it runs.
Either way, those are just recommendations that come from only a few minutes looking at your site. There could be many other issues to address.
-
RE: IP ranges and matching WHOIS
The question is this - why would you want to keep Site A given the current insurmountable challenge you describe?
Do you still hope there's some value in it being kept alive?
Do you still hope there's some SEO value or that the site will or does continue to bring some traffic you believe to be valuable?
Because (and this is just my opinion) if you are convinced you cannot or will not (for whatever reason) work to clean the mess up, you'd be better off completely killing off Site A.
If you don't, then even if you migrate Site B to a different server, the links still exist. The footprint remains.
-
RE: Www or not www base url
Ouch. If I understand what you just communicated, you've got a site that previously worked with the www version, but now, due to technical changes, the www version doesn't function? And won't for a couple of months?
That's a scary scenario and having worked with many different developers and systems administrators over many years, I've never allowed one to tell me "we can't fix that for a couple months" and get away with that claim.
It can either be addressed or it can't - at the site level or the server level, one way or another. Regardless of development framework, you should be able to set the non-www version to redirect to the www version at the server level, and it should work properly. If there's a massive bug in the Magento implementation, that sounds like a very serious flaw in the developer's skill set as far as I can tell.
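For reference, a minimal .htaccess sketch of that server level redirect (example.com is a placeholder; a Magento install may also need its base URL setting changed to match):

    # Hypothetical sketch - force non-www requests over to the www version
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]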
So - IF you're stuck, you're going to have a major SEO problem for longer than a couple months.
By all rights the only solution in that scenario is to scrap the www version altogether and NOT revert back to it in a couple of months. Change all the 301 redirect settings, and the preferred domain setting within GWT, to point to the non-www version. Then work to build up more links to the non-www version over time.
Because that's the only short-term solution you can do now from a best practices approach if the failure can't be quickly addressed.
And down the road, if you do this, you'd have to once again reverse everything, just causing you more problems.
So either get a developer / IT specialist who can fix it immediately, or scrap the www version altogether.
-
RE: When Is It Good To Redirect Pages on Your Site to Another Page?
If you have no long-term value you can identify for the page, yes, you can 301 redirect it if you want. So that's the question - is there long-term value?
If so, then I'd keep it and work on building it up.
-
RE: How long does it take for an article or a page to be listed by google
Diane,
A sitemap.xml file should include links to every page on the site you want indexed. While Google and Bing are fairly good at discovering content, this helps ensure they find pages sooner than their crawlers might otherwise get around to discovering them. (Unless you have a site with more than 10,000 URLs - at which point you should consider splitting the sitemap into multiple files and including a separate sitemap index file that you then submit.)
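A minimal sitemap index sketch, per the sitemaps.org protocol (the child file names are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-articles.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
      </sitemap>
    </sitemapindex>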
That then leads to the next question - how often? Every site is different and crawled at a different frequency based on Google's assessment of how often it should happen as well as factoring in that their system can only crawl so many pages on any given day.
That alone is reason to include all your content in sitemap files - and automatically ping search engines each time the sitemap file is updated.
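(For reference, the ping itself is just an HTTP GET against each engine's endpoint - example.com is a placeholder, and most sitemap plugins handle this automatically:)

    http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml
    http://www.bing.com/ping?sitemap=http://www.example.com/sitemap.xml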
If you have enough "news quality" content, look into a separate news sitemap file as well. With the right footwork and leverage, you can then see if your news specific content can be indexed even faster, and included in the Google news system as well.
-
RE: ECommerce Product Meta Descriptions vs. Product Descriptions
You're welcome Gretchen - would love to hear how it turns out in the end...
-
RE: Tips on building buzz and getting traffic for new sites
The best suggestion I can offer beyond your initial thoughts (all strong by the way) is to start in "Beta" mode (or maybe even "Alpha" mode), and make it clear that this is the case so people know it's very early in the life-cycle and thus will have much lower expectations regarding user / usage volume.
And definitely find a way to partner with someone who has very strong marketing skills. Without high quality marketing from someone who's "been there/done that", you could very well be setting yourself up for a big fail. There's no guarantee they'll help you succeed, or that early adopters will catch the buzz wave, however it's more likely they will.
-
RE: ECommerce Product Meta Descriptions vs. Product Descriptions
Are your product descriptions well written? Do they accommodate "cut and paste" use within Meta Description fields? Remember there's a practical character limit to Meta Descriptions (Google truncates the displayed snippet at roughly 155-160 characters) - too short and searchers may not be enticed to click; too long and you leave it to Google to decide where to cut off...
Otherwise the concept is sound, given that you've said the product copy is "SEO rich".
Then again what does THAT mean? A description that is too "spammy" looking may be a deterrent to a click.
Best course of action is to have human review on that whole policy.
And if you want to automate as much as possible, have the first portion of the product description the exact content you want in the meta description field. Standardize it. Make it a policy that writers need to keep that concept in mind.
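A minimal sketch of that policy in practice (the product copy here is invented for illustration):

    <!-- First sentence of the product copy, reused verbatim as the meta description -->
    <meta name="description" content="This hand-polished brass trophy cup ships in 24 hours and can be engraved with your own text and logo.">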
-
RE: How do you know if you have been penalized by search engines?
Penalties can happen for a lot of reasons these days. One of the quickest ways is to see if you can associate a specific, clear drop in organic site visits with a known date of one of Google's many publicized updates. (either a drop on an exact same day as an update or within a day or two after).
Examples of clearly identifiable drops related to specific Google updates can be seen here:
http://searchmarketingwisdom.com/wp-content/uploads/GoogleSurvivorTipsSlide3.png
You can compare your own analytics with the known dates tracked here on SEOmoz
However, if your site was penalized due to a "cascading" effect, that could be more difficult to pin down. For example, if your site suffers from a combination of "red flag" issues, it could be that you were on the verge of penalization for quite some time, and only an accumulation of them finally triggered it. (The last straw to break the rankings.)
Another consideration is if you have seen a big drop recently, you could very well bounce back in short order if it was just an accumulation of "perfect storm" factors (including changes Google made that they then shortly thereafter "fix", for example).
-
RE: Very Weird Type of Penguin Penalization
I first saw this particular type of penalty as long ago as late 2010. As soon as the bad links were cleaned up, I got those sites ranking again - but only because they had enough other positive, high quality signals.
Nowadays it's not so easy to know whether removing those links would be enough. Honestly, there could be other variables at play that were the triggers - it may in fact have been Penguin, or just as likely the other anchor text change Google made in April (in the list of 52 or 53 other changes for the month)...
-
RE: Can RSS Title tags be optimized?
Glad I could offer some clarity from an SEO perspective.
-
RE: Proper structure for site with multiple catagories of same products
A 301 redirect is a server level or site level command that tells a web browser (or crawler) to jump to the page the redirect points to. A canonical tag within a page is only a signal asking a search engine not to count/index this page, and to count/index the page named in the canonical tag instead.
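A minimal sketch of each, assuming Apache for the redirect (all URLs are placeholders):

    # Server level 301 (.htaccess) - browsers and crawlers alike get sent to the target:
    Redirect 301 /red-widgets/widget-100.html http://www.example.com/widgets/widget-100.html

    <!-- Page level canonical (in the <head> of the duplicate page) - only a hint to engines: -->
    <link rel="canonical" href="http://www.example.com/widgets/widget-100.html">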
-
RE: How long does it take for Google to index a new site and has anyone experienced serious fluctuations in SERP within 2 weeks after launch?
I have to agree with GIGS20 on the manual submission (obviously only to engines you care about). Why? Because in my own tests, I have consistently been able to get new sites ranked faster via direct sitemap submission than by waiting for crawlers.
As far as the whole fluctuation thing, think about it this way: there are multiple algorithms in the Google system, not just one. Every site first gets an evaluation on its on-site merits alone; that evaluation then needs to be held up in relation to other signals (off-site links and mentions, off-site social signals, etc.), and all of that has to be weighed against every other site their system determines might be a topical focus match.
When a site is new, there's not a lot to go on so every time Google churns another update, every time they run another algorithm, things will likely change for some sites.
Throw in the fact that many other sites in that topical focus are also being changed, worked on, further optimized (or hurt) every day, and the end result is an even more unstable ranking situation for new sites, especially in highly competitive markets.
-
RE: Any SEO suggestions for my site?
Whenever I see a Magento implementation, I immediately refer people to Yoast's write-up on the subject. He's about the best you can find from an expertise perspective on SEO and Magento. For even more goodness, the Magento Forums are a good place to spend some quality time on Magento-specific SEO discussions.
-
RE: Proper structure for site with multiple catagories of same products
I see this same issue a lot - nearly every time I'm hired to perform a forensic audit on an ecommerce site...
Here's how I responded in one of my recent audits to this question:
Search engines struggle to then determine "which of these two nearly identical pages is the original source, which is more authoritative, and which is merely an attempt to own two positions in search results for the same company."
Sometimes search engines overcome that struggle in a positive way, other times their automated systems fail miserably. More often than not, on an initial look, you don’t even realize how much of a problem it is if you think you’re doing well in your organic search based visits.
In reality, every page that competes with every other page results in a cannibalization effect. Every page suffers, at least a little, and cumulatively, entire sites suffer way more than you might even comprehend.
Solutions for consideration:
-
Keep all copies of each product but make them unique. If they are kept, every version or instance of a product needs to have its content completely re-written so that it is truly unique compared to every other instance.
-
Keep all copies of each product but decide which ones you want the search engines to find and rank - every other version should be blocked from indexing. Do not rely on Google to figure out which to rank and which to ignore. (See the sketch at the end of this post.)
-
Eliminate as many copies as possible by consolidating product detail pages while maintaining access to them from multiple categories. 301 redirect every version of those product detail pages except the primary one you intend to keep indexed and ranked in search engines.
There's a lot more to consider, such as canonical implementation. However, in addition to the issue with canonical you already described, the fact is that canonical tags are only signals. They are NOT directives, so using them means relying on Google to figure it out.
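For the second option above, a minimal sketch of blocking a duplicate version from indexing while keeping it live for users (placed in that page's head):

    <!-- Hypothetical sketch - keep the page for visitors, keep it out of the index -->
    <meta name="robots" content="noindex, follow">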
-
RE: Best way to handle SEO error, linking from one site to another same IP
Redirecting one site to another when the two sites are not highly related doesn't help SEO in a major way. And if there is, or ever was, any real value to the directory, people who type that domain and end up on your site will be confused - unless they land on a new custom page that explains why they were redirected, a page that at least has content related to the topic of that directory. That confusion in turn would not help SEO either.
Given your situation, if you set up such a page, don't communicate in that description the reason you shared here about why the site was created - that's not necessary. Instead, just describe what the directory was focused on overall, and explain that you hope to one day resurrect the directory on this site, but in the meantime hope visitors will find your site helpful...
(In other words, use marketing spin to make it sound all positive all around, while keeping topical focus on target to match the value of the 301 from an SEO perspective)
Did that make sense or did I just confuse you further?
-
RE: When is it good to use target="_blank"
Without performing your own tests, there's no 100% best answer for each specific situation, market or site. And you'll find people even here in the Moz community who prefer remaining in the same browser window and just as many who don't.
So... all I can offer is my own experience in UX work - I've found users have an expectation that when they're clicking on a link internal to a site that they remain within the same browser window, but that when they're clicking on any link out (to another web site or social site), that opens a new window.
This is especially important when the off-site destination breaks the browser's native "back" button, so that even if you want to go back to the site you came from, you can't.
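A minimal markup sketch of that convention (the URLs are placeholders):

    <!-- Internal link - stays in the same window -->
    <a href="/pricing/">Pricing</a>

    <!-- External link - opens a new window/tab -->
    <a href="http://example.org/study" target="_blank">Read the full study</a>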
-
RE: Non US site pages indexed in US Google search
John,
Thanks for adding all of these great suggestions - I don't do international that often so the full list of methods isn't always in my conscious awareness!
-
RE: 3 Products & 50 Options each, How does Google handle product variant or options?
Best practices SEO recommendations would dictate eliminating the unique URLs and consolidating to eliminate duplicate content conflict considerations. Never trust Google to "figure it out" when you can instead STRENGTHEN the core product content depth through consolidation.
To keep product pages clean, you can display product variations through CSS enabled tabs, or a host of other methods. (Dropdown menus, for example).
Then, redirect all those variation pages with 301 Redirects back to the new consolidated page version for each core product.
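A minimal Apache sketch of that consolidation, assuming the variants live at predictable URLs (all names here are hypothetical):

    # 301 each color variant URL back to the consolidated core product page
    RewriteEngine On
    RewriteRule ^products/widget-(red|blue|green)/?$ /products/widget/ [R=301,L]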
-
RE: Should I be using use rel=author in this case?
Check out Google's Source Attribution tag - that's the way I'd go in this situation rather than author, unless the "scraped" copy changes author info, which is a whole different issue...
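If memory serves, that markup is a pair of meta tags Google introduced for syndicated news content - a hedged sketch (the URL is a placeholder):

    <!-- On the syndicated copy, point back to the original article -->
    <meta name="original-source" content="http://www.example.com/original-article/">
    <meta name="syndication-source" content="http://www.example.com/original-article/">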
-
RE: Can RSS Title tags be optimized?
Your RSS titles should match the titles of the content as displayed in your web site's on-site view. While it's possible to have a programmer write a custom script to generate alternate titles, it sounds like you want unique titles for every city you want to rank for, even though the content the RSS delivers was not originally written for those cities. If my understanding is correct, that could cause problems and your site could be red-flagged for it.
-
RE: Getting a link from a blog on every page querying what OSE is suggesting
More important than Moz data is what the impact would be of getting thousands of links from one domain from an SEO perspective.
When you get links to a site, the "natural" pattern would be one link, or a couple - unless the links come from a site where you are an author, in which case they're sometimes within content but usually within author bio boxes. (And even those only have real value as links to the author's own site, NOT using SEO keywords that aren't specific to the author.)
Getting hundreds, or worse, thousands of links from one source is typically not otherwise "natural". It instantly gives the appearance of being bought/paid for.
While some sites can get away with that under some circumstances, if you look at it from an SEO best practices perspective, it's dangerous, now more than ever. So I would SERIOUSLY consider recommending that your client turn such an offer down. The risk is probably not worth the total value even if you did get such high weight based on Moz reports.
-
RE: Is it ok to use both 301 redirect and rel="canonical' at the same time?
Theory: Google is pretty good at figuring things out.
Reality: Google's algorithms, which go through hundreds of changes, tweaks and modifications every year, are a soupy mess. Their ability to "figure things out" was proven so flawed last year that, along with Microsoft and Yahoo, they came up with Schema.org just to address PART of that reality.
Recommendation: Never do anything that could possibly confuse Google if you don't absolutely have to.
-
RE: Keyword Targeting Best Practices??
"Lightly" seeding alternate phrases into a page's main content area where those are TIGHTLY related to the existing page's topical focus can often help if you do so by writing natural paragraph based content that includes those.
Also, vary up the anchor text that links into those pages, both from content areas of other pages on your site (not in navigation, and please, NEVER in footers), as well as from off-site inbound link sources.
Always keep the topical focus and link source to link destination highly relevant and trustworthy.
And don't try to go for one-to-one parity on forcing new links to have to match every single phrase or partial variation of a phrase you've seeded the page with. Let it unfold more "randomly".
By taking those steps alone, I have helped sites increase total volume of keyword phrases used to discover client sites by leaps and bounds because Google is really good (some of the time) at recognizing the broader context of a page when you use those methods.
-
RE: Trackbacks vs Links: What's the Difference?
Michelleh - while you got insight into what trackbacks are, I can offer this regarding why GA isn't showing some of those links.
Unfortunately, no analytics system is 100% accurate or complete, not even Google's. It's possible that if they're new or recent, they will eventually show up in GA reports, however it's not guaranteed.
That does NOT mean you don't get SEO linking value from them - only that reporting systems are imperfect AND Google's intent is to show "a sampling", even though they don't make that abundantly clear, so most people don't even know that's their stance.
If you want more insight into Google's sampling methods, you can read about them on the Google Developers site.
-
RE: Where to put Schema On Page
Always place schema markup directly in the position on the page where you want the content to appear if it's content specific - wrapping it around that content. So if your business name and address are in the main content area, that's where you place the schema code. It's literally a wrapper just like a CSS div would be, or an old-school HTML table, but not for display purposes on-site.
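For example, a minimal LocalBusiness sketch in schema.org microdata (the business details are invented for illustration):

    <div itemscope itemtype="http://schema.org/LocalBusiness">
      <span itemprop="name">Example Trophy Shop</span>
      <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
        <span itemprop="streetAddress">123 Main St</span>,
        <span itemprop="addressLocality">Springfield</span>
      </div>
    </div>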
EDITED 11/14/2013 based on a question from Oliver (below) regarding situations where markup is located in the "head" area of the page:
Exceptions to "in-body" markup:
As is the case with any structured markup solution, there will, from time to time, be cases where certain, specific elements go in the "head" section of the code. Anything that applies to an individual page in its entirety, and does not limit itself to an element of content within the page does, in fact, belong in the "head" area of the page code.
-
RE: External links incorrect
I have also seen the Moz system need time to find links after adding a new site to a campaign recently. However, also be aware that as vast as SEOmoz's data set is, it's not the entire web.
And in reality, no single link reporting solution in existence currently (even Google's) is going to show you all of the links that might be pointing to your site.
-
RE: Canonical Tag for Ecommerce Site
Okay so to be sure, you simply set up canonical tags to point to your newly identified "proper" URL for each product, correct?
If so, given the lapse in time between the change and the drop, I would have to assume something else has happened. If your canonical implementation was executed properly and there's no major flaw at the code level, some other factor would need to be the cause.
While there is a slight chance it's tied to the canonical change even if that was done properly, I'd definitely look at other factors as well.
-
RE: Non US site pages indexed in US Google search
It's absolutely possible that's what's happening. You cannot rely on Google's system being barred from crawling anything on your site, no matter how well you code it. Even if you blocked the URL with nofollow, it would not stop the bot.
Another factor is if all your content is in English (as your URL structure suggests it is). Google does a terrible job of discerning separation of international content when all the content is in the same language, on the same root domain.
Proper separation in a way Google can't confuse is vital. Since I expect you do not intend to change the language across sites, your best action would be to migrate international content to a completely different domain. At the very least you can then use GWT to inform Google that "this domain is for this country", however if you want to be even better off, you'd host that other content on a server in that country.
-
RE: Canonical Tag for Ecommerce Site
A canonical tag on every product page? Pointing to a different page, or pointing to themselves? And what was the reason for doing so?
-
RE: Genuine Reciprocal Google Places Reviews, is that OK?
Google is less clear most likely because they're still mostly stuck in the belief that they shouldn't reveal clarity and expect site owners to figure it out. Which inevitably leads, every year, to more and more "what used to be acceptable isn't" complaints.
Except some tactics never were acceptable and Google's just now getting around to addressing some that they previously never considered or never got a chance to.
My latest effort is all about "does this look natural?" That, of course, is then filtered through "does this look natural as Google views things, in their algorithmic attempt to emulate a human's perspective?"
-
RE: Re-Direct
I've read countless people's takes on the "will the bad links follow the 301" issue and need to say I've seen about a 50/50 split. I have yet to see an actual case study on it, though.
The fact that the existing site is 12 years old is also something with varying views - the only real benefit for age is for a site that gets 301 redirects with no content change on a page-to-page level (UX design can change without impacting that). If you were to 301 and change content, any age value would be lost as soon as Google reevaluated the new content based on current factors.
So the issue does come down to whether to 301 or not. Hopefully someone has done a real case study and will at some point blog about it or provide a link to it if it's already out there.
-
RE: Genuine Reciprocal Google Places Reviews, is that OK?
Yeah, it's annoying that Yelp specifically states it's against their TOS to actively solicit reviews this way, yet they are perfectly happy if you display their "We Yelp" stickers all over the place.
And oddly, places like the BBB's stand-alone "Trust-Link" reviews site are the exact opposite - they encourage business owners to seek reviews.
-
RE: Page Extension for SEO Post Penguin
First, and most important, be very careful about pointing such a refined niche phrase at the home page in an intentional manner.
Your site is not, in its entirety, all about Brisbane flower delivery. So while it's natural to have a handful of links pointing to the home page with that or a variation in the link, it's definitely NOT natural to have a lot of such links.
Too many people in SEO only get half the concepts then run with them. This is why so many sites are routinely slapped, devalued or otherwise penalized by Google over the long haul.
Always use the "is this natural looking" lens. Most people who do not have any idea what SEO is would link a Brisbane specific link directly to a page on a site where that specific page is about Brisbane.
And since 2010, I have seen sites devalued for specific keyword phrases that were too refined to be relevant when pointing to the home page, once there were too many similar or identical links. So please - be very cautious with that aspect of your work.
As for the page name, whether it's SEO or nothing to do with SEO, it's perfectly valid information retrieval logic to name an individual page this way, because that's a great way to identify that "this page is really about delivery of flowers within Brisbane", though a slightly better use would be "Brisbane-Flower-Delivery" or "Flower-Delivery-Brisbane".
Penguin should not have targeted (and I'm HIGHLY confident it did not target) properly named page URLs.
-
RE: Genuine Reciprocal Google Places Reviews, is that OK?
The danger of reciprocal reviews is being flagged because they're potentially unnatural. So if there's one-to-one parity (every business that reviews another business gets a review back from each of those businesses), that's a serious red flag to avoid. The same goes for reciprocal reviews that are always identical (4 stars each way, for example) - too easily spotted as suspicious.
Also, if there's a concerted effort and "conspiracy" to get reviews generated (a bunch of companies join a pool of companies to "agree to review each other"), that could lead to unnatural results. So it's a very cautious process to even consider.
The other issue is that if a participating business only (or mostly) reviews other businesses in that group, that's highly suspect. Reviews should be spread across a wide swath of businesses NOT in the group, and every participant would need their own set of reviews of outside businesses so no unnatural pattern emerges.
Other than that, it's perfectly valid to review other businesses when you've genuinely done business with them.
It's also perfectly valid for an official business account to review other businesses, since they're business to business transactions. And thus, no need to have a separate account just for the sake of reviews. (All reviews should be from an account that has a holistic profile regarding the activity on the account. It shouldn't be mostly, or all reviews and no other activity).
-
RE: Backlink focus?
Always think "is this natural looking?" Meaning step out of your SEO perspective and think about a site owner who's never heard of SEO.
In a natural web, links exist that point to all sorts of pages, not all home page focused. Why? Because if you've got a page two or three clicks deep with really great information that I want to share, I would link directly to it. Because why would I want to force people I'm sharing it with to go to your home page and maybe or maybe not find it? Forcing through the home page is definitely NOT natural.
Based on that very specific notion, search engines reward a site with more value that has links pointing to a diversity of pages within the site than they do to a site with links ONLY pointing to the home page.
Of course SEO is extremely complex, with hundreds of factors, so some sites MIGHT rank even if all links point to their home page. But that's where you can actually gain a competitive advantage and leap-frog ahead of them if you follow the "what's natural" approach. You can get higher rankings with fewer pages and fewer links by going the "what's natural" way.
Do NOT, however, ignore all the other key SEO factors (link quality, high relevance of the page that has the link in it to the page it's linking to, and all the rest of the factors that need to be considered related to link building).
-
RE: Subdomains vs. Subfolders Wordpress Multisite
While it's true that in the overwhelming majority of situations sub-folders are the best solution, I'm going to say - purely from the very limited information shared so far - that having sub-domains is far better than having full-blown individual domains. Though not necessarily as good as sub-folders, it's still better than the separate-domain model you have now.
It needs to be executed REALLY WELL - with extremely careful thought and consideration given to navigation and cross-domain linking. However, simply by having subdomains, you instantly let every prospective visitor understand they're all part of the same root domain. That alone boosts your trustworthiness in a BIG way. And Google does a fair job now at understanding (and in turn providing SOME ranking value) to the root domain from subdomains.
Just don't link to every other subdomain from every other one. Because that will instantly KILL your SEO.
-
RE: Javascript, PhP and SEO Impact?
JavaScript is one of several technologies that present severe limitations for search engines in their ability to properly see content and - just as important but often overlooked - to properly and cleanly evaluate that content from an SEO perspective.
Specific considerations:
- Google does a "fair" job at discovering content passed through JavaScript (either on-page or at the code level)
- A "fair" job means it's hit and miss as to whether their system can actually find that content
- Whatever content the Google system CAN find via JavaScript is NOT necessarily able to be used to properly evaluate content intent, focus or relationship to other content
So - the best practices recommendation is if you want/need content to be found and properly evaluated by Google (or Bing) do NOT pass it through JavaScript.
And also, if you want to HIDE content from Google, don't assume you can successfully do so via JavaScript either.
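To illustrate the risk, a minimal sketch of the same copy delivered two ways (the copy itself is invented):

    <!-- Injected client-side via JavaScript: discovery and evaluation are hit and miss -->
    <div id="copy"></div>
    <script>
      document.getElementById('copy').innerHTML =
        'Hand-engraved trophies, shipped within 24 hours.';
    </script>

    <!-- Plain HTML: reliably crawled and evaluated -->
    <p>Hand-engraved trophies, shipped within 24 hours.</p>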
As for PHP, it's the most widely adopted and utilized web programming language out there. The language by itself is essentially SEO neutral - it's all in how a programmer utilizes PHP that matters. In the hands of a programmer who either truly understands SEO or collaborates closely with an SEO expert (and who understands the limitations and pitfalls that can arise with "bad", SEO-unfriendly PHP coding), it's a great language.
-
RE: Can changing a host provider impact search rankings?
A quick way to check is reverse IP lookup. I routinely use the "You Get Signal" free checker. Just from domain names you can sometimes see right away if there's a lot of garbage. You can then just click through to sites on that server and see what they look like.
That's only possible if you know the IP your site is going to be on though, and doesn't show you all the sites on a shared server, or within a C-block.
Consider the host as well. Some hosts blatantly tout their SEO value, going well beyond the typical hosting provider's marketing spin that mentions SEO as a minor service.
So, for example, just do a search for "SEO host" - you can pretty much stay away from every site in that search result.
If you go with a top tier well known hosting provider, you should be fine. But if you do, as soon as the hosting is set up, run the reverse IP lookup. If things look suspicious, immediately contact them and request a change to a different server or C-block in their system.
-
RE: Can changing a host provider impact search rankings?
Retaining full link equity only happens if you keep all the content. If you dump it and start from scratch, you may retain equity for a period of time; however, over time Google's system is going to re-evaluate everything and you will likely lose a lot of it.
Given current Google anti-spam intent I would also caution that you could very well send up major red flags if you do a mass replacement of all content.