This might be a silly question...
-
I have 14,000 pages on my website, but when I do a site:domain.com search on Google, it shows around 55,000.
I first thought, "Hmm, maybe it's including subdomains," so I tried site:www.domain.com, and now it shows 35,000. That is still more than double the pages I have.
Any ideas why? When you filter a Google search with the site: operator, isn't it meant to return only that site's pages?
P.S. I tried using the SEOquake add-on to download the search results as a CSV file for review, but the add-on only exports the first 100 results.
-
Thanks, I'll look at manually specifying these parameters and see if they make an impact.
-
Thank you streamline,
That's interesting. I have provided 'searchType', 'searchTerm', 'search', 'cat', 'filter2name', and 'filter1name' as URL Parameters.
- Are URL Parameters case-sensitive?
- Should these not be left on Crawl: 'Let Googlebot decide', and instead be set manually as best practice? From what you've found, it looks like Google is still crawling and indexing them.
-
An easy way to be sure is to do a quick Google search and see if they are ranking. If you know for certain the parameters make no difference, it's usually better to signal that explicitly through the WMT console. While Google tends to be pretty smart about these kinds of things, it can always make mistakes, so you may as well give it as much information as possible.
-
Hi there,
I am doing a crawl on the site listed in your profile (www.abdserotec.com) using Screaming Frog SEO Spider with Googlebot as the User Agent, and I am seeing many more URLs than the 14,000 pages you mention. The vast majority of these excess pages are search results pages (such as http://www.abdserotec.com/search.html?searchType=BASIC&searchTerm=STEM CELL FACTOR&cat=&Filter2Name=GO&Filter2Value=germ-cell development&filterCount=2&type=&filter1name=Spec&filter1value=STEM CELL FACTOR). While these URLs are not showing up when you search your site with the site: command, Google is still definitely accessing and crawling them. As Tuzzell just suggested, I also highly recommend configuring the parameters within GWT.
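Alongside the GWT parameter settings, you could also tell crawlers to stay out of the internal search results altogether via robots.txt. A minimal sketch, assuming the search pages all sit under /search.html as in the URL above:

```
User-agent: *
Disallow: /search.html
```

Bear in mind robots.txt only blocks crawling; URLs Google has already discovered can still linger in the index, so pairing this with a noindex meta tag on the search template is the safer combination.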
-
We have 49 parameters listed, all set to 'Let Googlebot decide'. I thought adding the parameters there would stop Google from indexing those URLs? Doesn't our setup already do this?
-
What do you mean by "multiple ways"? We have a search page which isn't indexed, plus internal links from other pages, but those wouldn't count, would they? The URL string doesn't change whether a page is reached from a search page or an internal hyperlink.
-
Have you discounted URL parameters through Google Webmaster Tools? This is particularly relevant for an ecommerce site: if you have not, Google could be looking at /page, /page?p=x, /page?p=y, etc. and counting these as unique pages. That creates obvious duplicate-content issues and is easily fixed in WMT by going to:
Crawl > URL Parameters
Hope that helps.
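A common complement to the WMT setting is a rel="canonical" tag, so the parameterised variants all consolidate to one URL regardless of how Google reaches them. A hedged sketch, using the placeholder path from above:

```html
<!-- Served identically on /page, /page?p=x and /page?p=y -->
<link rel="canonical" href="http://www.domain.com/page" />
```

Unlike the WMT parameter tool, the canonical tag travels with the page, so it works for every search engine, not just Google.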
-
What about multiple ways of getting to the same product?
-
There are no blog posts; it's an ecommerce site, and every product page and article page has a URL under www.domain.com/.
I even looked at my GA, and it reports 14,000 pages.
If there were a tool to export all the search results, I could have manually looked into why the count is so big.
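There's no supported way to export the full site: result set (tools like SEOquake cap out around 100 results), but you can at least count the URLs you expect Google to find by parsing your XML sitemap and comparing that figure against the site: count. A rough sketch, assuming a standard sitemaps.org-format sitemap:

```python
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text: str) -> int:
    """Count the <url> entries in an XML sitemap document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{SITEMAP_NS}url"))

# Tiny inline example; in practice you would fetch your live sitemap.xml
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.domain.com/</loc></url>
  <url><loc>http://www.domain.com/product-1</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # prints 2
```

Any large gap between this number and the site: count points at extra URLs (parameters, duplicates) getting indexed.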
-
Hi Cyto,
Does that include your blog pages? If you have a blog, such as WordPress, then Google may be picking up the different URLs that each post can have. For example, you might file a blog post under two categories, which would make the post accessible from two different URLs.