On Site Errors
-
Hi folks,
I'm monitoring a small Australian site, bluetea.com.au. Currently I have an SEO specialist who does monthly onsite maintenance work for this site. However, each month I continue to see errors in Webmaster Tools... as an example, Webmaster Tools currently suggests we have 21 short meta descriptions and 26 duplicate title tags.
Examples given are:
Short meta descriptions:
/cleaning-products-and-our-health/toxic-cleaners/
/colour-consultant/fushia-door/
/portfolio/parisian-apartment-black-kitchen/parisian-apartment-black-kitchen/
Duplicate title tags:
concrete-kitchen | Blue Tea Kitchen Designs
/kitchen-trends-and-material-innovation/concrete-kitchen-2/
/kitchen-trends-for-2013/concrete-kitchen/
Potts Point Kitchen | Blue Tea Kitchen Designs
/portfolio/potts-point-kitchen/
/portfolio/potts-point-kitchen/pott-point-kitchen/
My SEO tells me that he has solved all these issues, but after one or two months they still remain in Webmaster Tools... can anybody help me understand why? Thank you
-
Your SEO is right - even if they have been updated, it can take a while before they disappear from Webmaster Tools. Google is not really best in class at updating these warnings. I did a quick crawl of your site with Screaming Frog, and I couldn't find any duplicate page titles or meta descriptions. 20 pages don't have a meta description (the /tag/ pages).
For a site of 89 HTML pages, the tool had to crawl a lot of resources, though - you could consider putting a 'nofollow' tag on the reply links in your comments (URLs of the type http://www.bluetea.com.au/sydney-kitchen-companies-a-buyers-guiAde/?replytocom=383 are not indexed, but the links to these pages are followed, which adds no value and just wastes Googlebot's time).
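If you want to check this yourself, here is a minimal sketch (assuming the requests and BeautifulSoup libraries are installed; the page URL is just a placeholder) that lists any ?replytocom links on a page and whether they already carry rel="nofollow":

```python
# Rough audit: list ?replytocom links on a page and flag whether
# each one already carries rel="nofollow".
import requests
from bs4 import BeautifulSoup

PAGE = "http://www.bluetea.com.au/example-post-with-comments/"  # placeholder - use any post URL that has comment reply links

html = requests.get(PAGE, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("a", href=True):
    if "replytocom=" in link["href"]:
        rel = link.get("rel") or []  # BeautifulSoup returns rel as a list
        flag = "has nofollow" if "nofollow" in rel else "MISSING nofollow"
        print(f"{flag}: {link['href']}")
```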
Some of the examples you give are extremely light on content (http://www.bluetea.com.au/portfolio/potts-point-kitchen/, http://www.bluetea.com.au/cleaning-products-and-our-health/toxic-cleaners/, http://www.bluetea.com.au/colour-consultant/fushia-door/) - you could consider adding some text to these pages, and at least adding an alt tag to the images you show.
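On the alt-tag point, a quick sketch along the same lines (requests + BeautifulSoup again; the URL is just one of the pages mentioned above) that lists images with missing or empty alt text:

```python
# Lists <img> tags on a page that have no alt attribute or an empty one.
import requests
from bs4 import BeautifulSoup

PAGE = "http://www.bluetea.com.au/portfolio/potts-point-kitchen/"  # one of the thin pages above

soup = BeautifulSoup(requests.get(PAGE, timeout=30).text, "html.parser")
for img in soup.find_all("img"):
    # img.get("alt") returns None when the attribute is absent
    if not (img.get("alt") or "").strip():
        print("Missing or empty alt:", img.get("src", "(no src)"))
```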
While visually very attractive, 27 images are above 100 KB - check whether you can compress them (think about your poor mobile visitors). Page speed seems to be quite good; I did notice, however, that you use a lot of JavaScript (23% of bytes transferred is JavaScript) - a bit strange, as your site doesn't seem to be that complex.
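If you have a local copy of the uploads folder, a rough sketch like the one below - assuming the Pillow library, with a placeholder folder path - flags files over 100 KB and writes a recompressed copy next to each one so you can compare quality before swapping them in. A CDN or a WordPress compression plugin will do the same job; this is only meant to show how little is involved.

```python
# Finds JPEGs larger than 100 KB under a folder and writes a
# recompressed copy alongside each one for comparison.
from pathlib import Path
from PIL import Image  # Pillow

UPLOADS = Path("wp-content/uploads")  # placeholder - point this at your local copy of the uploads folder
LIMIT = 100 * 1024  # 100 KB

for path in UPLOADS.rglob("*.jpg"):
    if path.stat().st_size > LIMIT:
        out = path.with_name(path.stem + "-compressed.jpg")
        # Re-save at quality 80; adjust until the visual quality is acceptable
        Image.open(path).convert("RGB").save(out, "JPEG", quality=80, optimize=True)
        print(f"{path} ({path.stat().st_size // 1024} KB) -> {out} ({out.stat().st_size // 1024} KB)")
```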
Also check your external links - 4 of them lead to pages that no longer exist (http://www.cosmit.it/tool/home.php?s=0,2,67,71,78, http://www.sampfordixl.com.au/neff/t44t97.html, http://www.kitchenwaresuperstore.com.au/, http://www.forbo-flooring.com.au/).
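Those four can be re-checked in a few lines whenever you update the pages - a sketch using only the requests library:

```python
# Checks a handful of external links and prints the HTTP status for each.
import requests

LINKS = [
    "http://www.cosmit.it/tool/home.php?s=0,2,67,71,78",
    "http://www.sampfordixl.com.au/neff/t44t97.html",
    "http://www.kitchenwaresuperstore.com.au/",
    "http://www.forbo-flooring.com.au/",
]

for url in LINKS:
    try:
        status = requests.get(url, timeout=15, allow_redirects=True).status_code
    except requests.RequestException as exc:
        status = f"error ({type(exc).__name__})"
    print(f"{status}: {url}")
```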
rgds,
Dirk
-
The links appear to point to pages that were automatically generated when an image was attached to a page or post in WordPress. This is fairly common with WordPress. Fortunately, these pages have the following meta tag, which tells search engines that they should not be indexed or crawled.
<meta name="robots" content="noindex,nofollow">
-
Looking at that first example, the meta description code for that page shows the meta description is "nofollow7", plus Google is being told not to index the page. Is that supposed to be the case? If yes, you could remove the URL in GWT. If no, you'll want to get that cleaned up.
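If you want to spot-check what Google actually sees on the URLs flagged in Webmaster Tools, a small sketch like this (requests + BeautifulSoup again; the URL list is just the examples from the question) prints each page's robots meta tag and meta description:

```python
# Prints the robots meta tag and meta description for each URL,
# so you can spot-check what the search engines are being told.
import requests
from bs4 import BeautifulSoup

URLS = [
    "http://www.bluetea.com.au/cleaning-products-and-our-health/toxic-cleaners/",
    "http://www.bluetea.com.au/colour-consultant/fushia-door/",
    "http://www.bluetea.com.au/portfolio/potts-point-kitchen/",
]

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    desc = soup.find("meta", attrs={"name": "description"})
    print(url)
    print("  robots:     ", robots.get("content") if robots else "(none)")
    print("  description:", desc.get("content") if desc else "(none)")
```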
The in-head CSS also isn't a best practice. Have you run these pages through Google's PageSpeed tool and other markup validators?
P.S. Meta Tags that Google Understands: https://support.google.com/webmasters/answer/79812
Related Questions
-
Can service request pages be indexed for a service site?
I think there is no point in indexing service request pages for a service site, and it delays the indexing of the main pages. Does anyone have experience with indexing service request pages, and what were the results?
On-Page Optimization | | sora.ya04680 -
Site Wide Links
Howdy Moz! Our agency has been around long enough that a few sites we've built carry our credit in their footer, resulting in a site-wide link - mostly just our name. We've heard that Google does not particularly like site-wide links; should we go through and remove some of these old links?
On-Page Optimization | | wearehappymedia0 -
What are the Best On-Site SEO Practices before an E-commerce Site Goes Live?
Hello, I'm working on a client's e-commerce website. This website is not live yet. Before the site goes live, I am curious to know what the best on-site SEO practices are. Please let me know which factors I should start analyzing. Thanks.
On-Page Optimization | | TopLeagueTechnologies0 -
SEO and multilanguage site
Hi all! I have used a WordPress plugin called WPML, which translates a page into another language, so I have a site in two different languages (Spanish, the main market, and English). I'm just doing the SEO for the Spanish market and I'm going to start with the SEO for the English one. Should I do it just the same as if I had a single-language site, just with English keywords, etc.? I guess it would only differ in the link-building strategy, as the markets are different. Thanks
On-Page Optimization | | juanmiguelcr0 -
Spanish version of site - best practice?
I need to create a Spanish version of an existing site. My idea was to have the Spanish content switch out the English content if the query string had something like ?l=es. It would also drop a cookie so that all other pages would switch out content as well. I do want the Spanish content to be indexed and rank in the search engines, though. I would include all of the Spanish versions (with the ?l=es) in the site map and link to them on every page with a link to the Spanish version. Does anyone have any experience with this? Is this a bad idea? Thanks! Tom
On-Page Optimization | | TomBristol0 -
Can an e-commerce site have Google Authorship?
Hi, can an e-commerce site have Google Authorship? If yes: I have learned that Google requires a face for Authorship, as they apply facial recognition to it. If so, how can an e-commerce website have an individual's face?
On-Page Optimization | | usef4u0 -
Large Site - Advice on Subdomaining
I have a large news site - over 1 million pages (have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining http://bit.ly/dczF5y

There are two types of content - news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP or we pull from their server. These are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs at the time the release comes in and handles most of them. The other runs every night after midnight and finds a few, which are then handled manually. This helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.

The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us. Some are from contributors who also contribute to other news sites. Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like and I haven't yet worked it out. A lot of people have looked and given me their ideas, and I've tried them - zero effect.

Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even if it is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up on the front page, often in position #2 or #3.

According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem that we haven't fixed. You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites that consistently rank above us, that have all the same content as us, and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different - so this is (I think) inconsistent, confusing, and it doesn't help me work out what to do next.

Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screwups because we had multiple editors who didn't see that a story was already published. Also, at one time, a system code race condition - entirely my fault; I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.
I haven't sent in a reconsideration for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't. So, having exhausted all of the things I can think of, I'm down to my last few ideas:

1. Split all of the PRs off into subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline, a snippet and some related info, and link back to the original page on the PR provider website. (I really don't want to do this.)
3. Give up on the PRs, delete them all and lose another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us (or find them all and rewrite them as stories - tens of thousands of them), and also throw all our alliances under the bus. (I really don't want to do this.)

There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped. My thought is that splitting them off into subdomains will have a number of effects:

1. Take most of the syndicated content onto subdomains, so it's not on the main domain.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.

Here is what I plan to do:

1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.

Here are my questions - and I hope you may see something I haven't considered.

1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them only in one place, but there are two types of these index pages: a) all of the releases for a particular PR company - these certainly could be on the subdomain and not on the main domain; b) various category index pages - agriculture, supermarkets, mining, etc. These would have to stay on the main domain because they are a mixture of different PR providers.
5. Is this a bad idea?

I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
On-Page Optimization | | loopyal0 -
Site Downtime - Will pages be reindexed?
I recently had about 10 days of downtime for a site I'm working on, which is about 6 months old. There were naturally many page-not-found errors in Webmaster Tools, as Google had tried to crawl the site during the downtime, but I've noticed that Google has now dropped some of the pages. Are they likely to be reindexed? Google has crawled some of my pages today, but as yet the ones that were dropped from the index haven't reappeared (the site has been live again for a week). Should they reappear when the pages are next crawled?
On-Page Optimization | | SamCUK0