Facebook likes for website OR FB profile page?
-
I am expecting answers/views from social SEO experts.
Should I encourage people to "like" the main page of my website, or the Facebook profile page for the website? Which one works better, and why?
-
Gianluca said it very well, so all I'd like to add is:
If you're asking in terms of RANKINGS (which will help you rank better?), use Likes on your website, and be sure the proper Open Graph meta tags are installed on your page. The number of Facebook Likes and Shares of your website's pages can improve rankings.
You can generate the meta code here.
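As a rough illustration (the values below are placeholders, not from this thread; generate your own with the tool mentioned above), basic Open Graph tags placed in a page's head look something like this:

```html
<!-- Hypothetical example: replace every value with your own site's details -->
<head>
  <meta property="og:title" content="Example Widget Shop" />
  <meta property="og:type" content="website" />
  <meta property="og:url" content="https://www.example.com/" />
  <meta property="og:image" content="https://www.example.com/images/logo.png" />
  <meta property="og:site_name" content="Example Widget Shop" />
  <!-- Ties the page to a Facebook account so you can view Insights for it -->
  <meta property="fb:admins" content="YOUR_FACEBOOK_USER_ID" />
</head>
```

With these in place, a Like or Share of the page carries the correct title, URL, and thumbnail into Facebook.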
-Dan
-
My answer may seem like a non-answer: it depends.
Specifically, it depends on the social media strategy you want to follow and on whether you are really looking for deep integration of your site into the social graph.
In fact, having people "like" your page means integrating the Open Graph code into your site so you can take advantage of all the features Facebook offers: Facepile, comments, sign-in, Like/Share, the incoming action buttons (read, listen...), and Facebook Insights for Websites. If you are looking for a direct way to surface your site inside Facebook, integrating it with Open Graph is a must, because every action taken by a logged-in Facebook user is reflected in their timeline.
This has an advantage over the Page: this way a user is not "obliged" to pay attention to the Page's status updates, which they may not even be seeing because those updates lack sufficient EdgeRank to appear in their news feed (and soon the "ticker").
That said, having a Facebook Page for your site is useful too... as long as you don't use it simply as a repository of links to your new blog posts or new products. If you plan a real marketing strategy for your Page (contests, special offers, genuine dialogue with your fans), then you can create brand evangelists who become your unpaid sales force.
See, for instance, what SEOmoz does with its Page. SEOmoz uses it not only to announce new blog posts but also to strengthen community bonds with photos, videos (sometimes), and active conversation (not to mention using Roger as brand ambassador).
To conclude: Open Graph integration on your site and a Facebook Page can coexist and complement each other, and each has its own reason to exist.