Crawl errors I don't understand
-
Hi all, after crawling my website, SEOmoz has alerted me to some errors (28 exactly), all with the same problem:
At the end of each URL you can see "Piensa_Piensa", which I haven't added at all.
It's present in all the URLs that SEOmoz has reported as errors.
The website was built with WordPress.
What does it mean?
Many thanks
Many thanks
-
It's one hell of a Swiss Army knife for webmasters & SEOs huh? Another big fan here too. (But like all powerful tools - ya gotta know what to ask it and how to interpret the results in order to get the most from it - which you did.)
Paul
-
Well done, sir!
-
Thanks Paul. I just forgot it... done!
-
Thanks Paul - I can't take most of the credit - got to give it up to the guys at Screaming Frog - love the spider
-
Hey Juan Miguel - Mark did a great job of quickly finding the error for you and explaining how to fix it. Would be great to mark his post as a "helpful answer" to give him a bit of extra credit for his help
-
Nice catch, Mark.
Paul
-
Thanks Mark for your help.
-
Basically, SEOmoz is crawling your site and telling you they found these errors. These errors were found by following internal links.
I took a quick look at the code on your site. In the social icon buttons at the top right of your page header, the Twitter icon is linking back to your own site, to Piensa_Piensa. Here is the code:
class="social-icon"><a target="_blank" href="Piensa_Piensa"><img src="http://piensapiensa.com/wp-content/themes/modernize-v3-11/images/icon/dark/social/twitter.png" alt="twitter"/></a>
Your href is the relative URL Piensa_Piensa, so the browser (and crawler) appends that text to the URL of every page the header appears on. Correct this link and the errors will be fixed: either use "#" as a placeholder until you're ready to link to your Twitter account, or link to the full address of your Twitter account. This will solve your problem.
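As a quick sketch of why the same bad link produces 28 different error URLs: a relative href with no scheme or leading slash resolves against the URL of whichever page it appears on. Python's standard-library urljoin demonstrates the effect (the page URLs below are illustrative, not actual pages from the site):

```python
from urllib.parse import urljoin

# A relative href like "Piensa_Piensa" resolves against the URL of the
# page it appears on, so a broken link in a site-wide header yields a
# different bad URL for every page the crawler visits.
pages = [
    "http://piensapiensa.com/",
    "http://piensapiensa.com/blog/",
    "http://piensapiensa.com/blog/some-post/",
]
for page in pages:
    print(urljoin(page, "Piensa_Piensa"))
```

Each page produces its own ".../Piensa_Piensa" URL, which is why the crawler reports one error per page rather than a single broken link.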
Good luck,
Mark