Site: Query Question
-
Hi All,
I have a question about the site: query you can run on Google. I know it has lots of inaccuracies, but I like to keep a high-level sense of it over time.
I was also using it to try to get a high-level view of how many product pages were indexed vs. the total number of pages.
What is interesting is that when I run a query for site:www.newark.com I get ~748,000 results returned.
When I run site:www.newark.com "/dp/" I get ~845,000 results returned.
Either I am doing something stupid, or these numbers are completely backwards.
Any thoughts?
Thanks,
Ben
-
Barry Schwartz posted some great information about this in November of 2010, quoting a couple of different Google sources. In short, more specific queries can cause Google to dig deeper and give more accurate estimates.
-
Yup. Get rid of parameter-laden URLs and it's easy enough. If they hang around the index for a few months before disappearing, that's no big deal; as long as you have done the right thing, it will work out fine.
Also, you're not interested in the chaff, just the bits you want to make sure are indexed. So make sure those are in sensibly titled sitemaps and it's fine. (I've used this on sites with 50 million and 100 million product pages. It gets a bit more complex at that scale, but the underlying principle is the same.)
-
But then on a big site (talking 4m+ products) it's usually the case that you have URLs indexed that wouldn't be generated in a sitemap because they include additional parameters.
Ideally, of course, you'd rid the index of parameter-filled URLs, but that's pretty tough to do.
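A first step toward cleaning those up is simply measuring how many of your known URLs carry query parameters. Here is a minimal sketch using only Python's standard library; the URLs are made-up examples, and in practice you would feed in a crawl export or log sample:

```python
from urllib.parse import urlparse

def split_parameterized(urls):
    """Separate URLs that carry a query string from clean ones."""
    clean, parameterized = [], []
    for url in urls:
        # urlparse(...).query is non-empty when the URL has ?key=value parameters
        (parameterized if urlparse(url).query else clean).append(url)
    return clean, parameterized

# Hypothetical sample of indexed URLs
urls = [
    "https://www.example.com/dp/12345",
    "https://www.example.com/dp/12345?sort=price&page=2",
    "https://www.example.com/category/widgets",
]
clean, parameterized = split_parameterized(urls)
print(f"{len(parameterized)} of {len(urls)} URLs carry parameters")
```

From there you can decide which parameterized URLs to canonicalize, block, or redirect.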
-
Your best bet is to make sure all your URLs are in your sitemap; then you have an exact count to compare against.
I've found it handy to use separate sitemaps for each subfolder (e.g. /news/ or /profiles/) so I can quickly see exactly what % of URLs are indexed from each section of my site. This is super helpful for finding errors in a specific section or when you are working on the indexing of a certain type of page.
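The per-section comparison described above can be scripted. This is a rough sketch, assuming you have a flat list of your site's URLs and a list of URLs reported as indexed (e.g. exported from your sitemap reports); the function and sample URLs are hypothetical:

```python
from collections import defaultdict
from urllib.parse import urlparse

def indexed_percentage_by_section(site_urls, indexed_urls):
    """Group URLs by first path segment and report % indexed per section."""
    indexed = set(indexed_urls)
    totals, hits = defaultdict(int), defaultdict(int)
    for url in site_urls:
        path = urlparse(url).path.strip("/")
        # "/news/story-1" -> "/news/", bare domain -> "/"
        section = "/" + path.split("/")[0] + "/" if path else "/"
        totals[section] += 1
        if url in indexed:
            hits[section] += 1
    return {s: 100.0 * hits[s] / totals[s] for s in totals}

# Hypothetical example
pct = indexed_percentage_by_section(
    ["https://ex.com/news/a", "https://ex.com/news/b", "https://ex.com/profiles/x"],
    ["https://ex.com/news/a", "https://ex.com/profiles/x"],
)
print(pct)  # one percentage per section, e.g. /news/ and /profiles/
```

A section whose percentage drops sharply relative to the others is a good place to start looking for crawl or quality problems.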
S
-
What I've found is that the reason for this comes down to how Google's system works. Case in point: a client site of mine with 25,000 actual pages and mass duplicate-content issues. When I do a generic site: query with the domain, Google shows 50-60,000 pages. If I do an inurl: query with a specific URL parameter, I get either 500,000 or over a million.
Though that's not your exact situation, it can help explain what's happening.
Essentially, if you do a normal site: query, Google will try its best to show the content within the site that it presents to the world based on the "most relevant" content. When you do a refined check, it's naturally going to look for the content that really is most relevant: the closest match to that actual parameter.
So if you're seeing more results with the refined process, it means that on any given day, at any given time, when someone does a general search, the Google system will filter out a lot of content that isn't seen as highly valuable for that particular search. So all those extra pages that come up in your refined check - many of them are most likely then evaluated as less than highly valuable / high quality or relevant to most searches.
Even if many are great pages, Google's system has multiple algorithms that must run to assign value. What you are seeing is those processes struggling to sort it all out.
-
about 839,000 results.
-
Different data center, perhaps. What do you see if you add the "/dp/" query to the string?
-
I actually see 'about 897,000 results' for the search 'site:www.newark.com'.
-
Thanks Adrian,
I understand those areas of inaccuracy, but I didn't expect to see a refined search produce more results than the original search. That just seems a little bizarre to me, which is why I was wondering if there was a clear explanation or if I was executing my query incorrectly.
Ben
-
This is an expected 'oddity' of the site: operator. Here is a video of Matt Cutts explaining the imprecise nature of the site: operator.