We just can't figure out the right anchor text to use
-
We have been trying everything we can with anchor text. We have read here that we should use naturalistic language, yet the competitors who rank above us in Google search results don't do any of this: they only use their names or a single term like "austin web design". Is what we are doing hurting our rankings? We don't have any black-hat links. Here's what we are doing now. We are going crazy trying to figure this out, and we're afraid to do anything for fear it will damage our position.
Bob
| Anchor Text | Linking Root Domains | Total Links |
| --- | --- | --- |
| pallasart web design | 31 | 1,730 |
| website by pallasart a texas web design company in austin | 15 | 1,526 |
| website by the austin design company pallasart | 14 | 1,525 |
| created by pallasart a web design company in austin texas | 13 | 1,528 |
| created by an austin web design company pallasart | 12 | 1,499 |
| website by pallasart web design an austin web design company | 12 | 1,389 |
| website by pallasart an austin web design company | 11 | 1,463 |
| pallasart austin web design | 9 | 2,717 |
| website created by pallasart a web design company in austin texas | 9 | 1,369 |
| website by pallasart | 8 | 910 |
| austin web design | 5 | 63 |
| pallasart website design austin | - |
Thank you both for helping us. We talked about what you wrote this morning and are making changes based on this advice.
-
What more can be said? Nailed by EGOL.
-Andy
-
> website by pallasart a texas web design company in austin

I would keep it really, really short. Get the name of your company in there and leave it at that. Why?
- Pallasart Web Design is easy to read.
- Pallasart Web Design is more memorable.
- Pallasart Web Design, used on all of your designs, is a consistent branding message (I hope that is your domain name).
- Pallasart Web Design is your brand name, and in my opinion Google doesn't like keyword-rich anchor text.
- People are going to click through based upon the quality of your work rather than where you are located (in ten years of running many sites, everyone I have hired is very far from me, because I hire based upon who does work that I respect).
- People who click through this type of link will do so based upon how much they think you know about Google, and I personally think Google frowns on long, keyword-rich anchors in an attribution link.
- People are going to click through based upon how good you are at creating links that elicit clicks, and I think a short anchor, rather than a keyword-rich one, is more effective at eliciting clicks.
- A lot of people really dislike these types of links (search here for the heated discussions about them), and they would allow "Pallasart Web Design" long before they would allow the long messages you provided as samples. Some will not want any attribution link at all.
- Some people are going to check your code to see whether the link has nofollow on it, and they will be more likely to allow the link if it is nofollowed.
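For reference, the nofollow in that last point is a single attribute on the link. Here is a minimal sketch of a nofollowed attribution link (the URL is a placeholder, not the company's actual domain):

```html
<!-- Attribution link in a site footer. rel="nofollow" asks search engines
     not to pass ranking credit through the link, which is exactly what
     cautious site owners check for before allowing it to stay. -->
<a href="https://www.example.com/" rel="nofollow">Pallasart Web Design</a>
```

Note the anchor text stays short and branded, consistent with the advice above.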