Does Google Read URLs if they include a # tag? Re: SEO Value of Clean URLs
-
An ECWID rep, responding to an inquiry about why ECWID URLs are not customizable, stated that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs." Example: http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
Basically, all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
That is a snippet from a conversation where ECWID said that dirty URLs don't matter beyond the #...
Is that true? I haven't found any rule stating that Google or other search engines (Google being the most important) don't index, read, or place value on the part of the URL after a #.
-
Thanks, Sachin.
So basically, on sites that use ECWID for their ecommerce, only the main pages of the actual website (not the product pages that ECWID generates, i.e., the part from the # on) get indexed?
Essentially, Google is NOT indexing any products, because ECWID uses an existing page on the website and shows products there.
Is that correct? For example, if you look at the XML sitemap for the running boards site we used as an example, you will see there are only 10 pages on it. However, over 1,000 different types of running boards are sold on the site, each with its own page populated after a # in the URL: http://www.runningboards4less.com/index.php?option=com_xmap&view=xml&tmpl=component&id=1
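This is easy to verify programmatically: fragment-only "pages" can never appear in a sitemap, because a sitemap lists full URLs and the part after # is not a separate URL. Here is a minimal sketch that counts `<loc>` entries in a sitemap; the XML below is a made-up two-page sample in the same shape as the com_xmap output linked above, not the real file.

```python
import xml.etree.ElementTree as ET

# Hypothetical, trimmed sitemap sample -- only top-level pages appear,
# never the #! product fragments.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.runningboards4less.com/</loc></url>
  <url><loc>http://www.runningboards4less.com/general-motors</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = [el.text for el in ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", ns)]
print(len(locs))  # 2 -- none of the 1,000+ product fragments can show up here
```

Running the same count against the real sitemap would confirm the 10-page figure above.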
-
Traditionally, search engines ignore everything after the hash because it usually points to content within the same page or URL. Those fragment URLs should therefore not get indexed (only the part before the hash should be). In my experience, they completely disregard anything after the # in a URL.
However, it is always advisable to use clean URLs, as both search engines and people prefer them over complicated ones. Clean URLs improve usability, helping users remember and share them more easily. Another benefit of a simple URL is that other sites are more likely to link to it, because doing so is easier.
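You can see why the part after # is "invisible" by looking at how URL parsers treat it: the fragment is split off from the path, and browsers never send it to the server in the HTTP request. A quick Python sketch, using the example URL from the question:

```python
from urllib.parse import urlsplit

# The Ecwid-style URL from the question above.
url = ("http://www.runningboards4less.com/general-motors"
       "#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891")

parts = urlsplit(url)
# The fragment is client-side only: an HTTP request for this URL asks the
# server for the path alone, so a crawler fetching it sees /general-motors.
print(parts.path)      # /general-motors
print(parts.fragment)  # !/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
```

Everything in `parts.fragment` lives only in the browser, which is why all the product "pages" resolve to the same server-side URL.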
-
Anyone? Bueller? Bueller?
Also, if anyone knows how to modify Ecwid URLs so that they are "clean," please chime in...
-
Thank you for your response. I am not implying that it is indexing a "separate" URL. I am referring to the SEO value of a proper "clean" URL for the specific page. ECWID doesn't allow its users to create custom URLs.
If I were creating a URL for the page I listed above, I would have it be something like ****.com/chevy-van, NOT .com/#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
My question concerns the low value, or complete lack of value, of a URL like the long one above, and whether the statement made by the ECWID rep is factual.
-
These are called AJAX URLs: URLs containing a hash fragment, e.g., www.example.com/index.html#mystate, where #mystate is the hash fragment.
Regarding the URL mentioned above: it uses a hash-bang (#!), not a plain hash, which makes AJAX/JavaScript pages crawlable. A plain # indicates a location on a page (an anchor), so it does not get indexed as a separate URL.
You can find detailed information here: https://support.google.com/webmasters/answer/174992?hl=en
https://support.google.com/webmasters/answer/174993
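Under that (now-deprecated) AJAX crawling scheme, Google requested a #! URL by rewriting it into an `?_escaped_fragment_=` URL that the server could actually answer. A minimal sketch of the mapping; the helper name is mine, and the exact escaping rules are in Google's specification linked above:

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a hash-bang (#!) URL to the ?_escaped_fragment_= form that
    Google's AJAX crawling scheme used when fetching it (hypothetical helper)."""
    base, sep, fragment = url.partition("#!")
    if not sep:
        return url  # no hash-bang: nothing to map
    joiner = "&" if "?" in base else "?"
    # Keeping "/" and "=" unescaped here for readability; see Google's
    # spec for the precise set of characters that must be percent-encoded.
    return base + joiner + "_escaped_fragment_=" + quote(fragment, safe="/=")

print(escaped_fragment_url(
    "http://www.runningboards4less.com/general-motors"
    "#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891"
))
# http://www.runningboards4less.com/general-motors?_escaped_fragment_=/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
```

This is why #! pages could be indexed while plain # anchors could not: the server had a crawlable URL to respond to.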
Hope this helps!