Desktop vs. Mobile Results
-
When googling "wedding invitations" on www.google.ca from my own geo location market of Toronto, my site - www.stephita.com - shows up differently on the SERP on desktop (Chrome & IE) vs. mobile (iPad, iPhone, Android, etc.).
On the desktop SERP, I show up in position 6/7 (a relatively new position as of the past 3 weeks - I was previously on page 2, but after a bunch of SEO fixes I've managed to propel my site back to page 1!).
On the mobile SERP, I only show up in position 1/2 on PAGE 2.
As I mentioned above, I made a bunch of SEO fixes that I think were related to the Panda/Penguin algos. So I'm wondering why my mobile SERP position has NOT improved along the way. What should I be looking at to fix this 5-6 position differential?
Thanks all!
-
So I used Google PageSpeed Insights to get a better idea.
I'm somewhat technically savvy, but I can't seem to wrap my brain around HOW to do the "Enable Compression" fix.
Do you happen to have a simple example of what needs to be done? I understand the concept - a compressed file saves on bandwidth, etc. - but am I literally gzipping files like "jquery.js"? And how does the HTML code work with that? Any straightforward example you could possibly show me?
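For context, compression normally happens at the web server level, not in your HTML or PHP: the server gzips each response on the fly, the browser decompresses it transparently, and your markup doesn't change at all. You don't pre-gzip "jquery.js" by hand. Assuming the host runs Apache (common for PHP sites, but worth confirming), a minimal sketch of an .htaccess rule using mod_deflate might look like:

```apache
# Sketch only - assumes Apache with mod_deflate available (confirm with your host).
# Compress text-based responses on the fly; browsers that send
# "Accept-Encoding: gzip" receive the compressed version automatically.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```

Images such as jpgs are usually left out of compression rules because those formats are already compressed.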
Same with ENABLING CACHING... My site is done w/ PHP, so I thought sending a simple header command like so:
header("Cache-Control: max-age=2592000");
would suffice... but PageSpeed Insights still flags the following elements: the jpg, css, and js files that my www.stephita.com index file references. Am I doing the Cache statement correctly by putting it on the INDEX.PHP page? Or do I literally have to reference each jpg/css/js file that PageSpeed Insights says needs something done? Again, examples?
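A likely explanation (the standard behavior, though worth verifying on your setup): the header() call in INDEX.PHP only sets Cache-Control on the HTML response that PHP generates. The jpg/css/js files are served directly by the web server, so PHP never sees them - and you don't need to reference each file individually either. Assuming Apache again, cache headers for static files can be set once in .htaccess with mod_expires, along these lines:

```apache
# Sketch only - assumes Apache with mod_expires available (confirm with your host).
# "access plus 30 days" matches the max-age=2592000 (2,592,000 seconds = 30 days)
# used in the PHP header() call for the HTML itself.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 30 days"
  ExpiresByType text/css "access plus 30 days"
  ExpiresByType application/javascript "access plus 30 days"
</IfModule>
```

The PHP header() line stays where it is; it and the .htaccess rules cover different responses (the generated HTML vs. the static assets).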
I appreciate any help on this matter.
-
Tyson, Google's mobile and desktop algorithms are different, so that's why you're seeing different results. As Ruben mentioned, there may be mobile issues that you can fix or optimize for, which will help your site rank better in the mobile search results.
I took a look at your site in Google's Mobile-Friendly Test (https://www.google.com/webmasters/tools/mobile-friendly/) and it's mobile friendly, but the WebPageTest.org results show that it definitely needs to be cached. You should set up a CDN such as CloudFlare in order to cache the pages on the site.
-
One major reason might be your mobile score according to PageSpeed Insights. When I put your site in, your mobile page scored 50/100. Your desktop was 64/100, which is not good, but still better than 50.
If you have the capability, or can hire someone who does, implement the suggestions from PageSpeed Insights. Enable compression, leverage browser caching, etc., and I bet that will help close the gap, at least some.
Best,
Ruben