Herbal Viagra page same DA/PA as UC Berkeley??
-
Either there is some amazingly good SEO work going on here, or Google has an enormous hole in its metrics.
http://www.ucdavis.edu/index.html
The "nottowait" page has a PA of 85?! and a DA of 82?!
The page is HORRIBLE. The page itself is an image of another page. The nav bar does not function, nor does any of the "click here" links. At the bottom there is a paragraph of keywords and broken english.
This page is pure junk and should simply not have any value at all with respect to DA nor PA.
It has a ton of incoming links from various sources which seem to be the source of all this value, which it passes on to other pages. This page really is an affront to the "content is king" concept.
I suppose I should ask a question but all I can think of is, what is Matt Cutts' phone number? I want to ask him how this page has gotten away with being ranked so well for so long.
-
I believe that this is a clear demonstration that link metrics are a decoy that pulls you away from what is really important.
-
You can always report it here - http://www.google.com/support/webmasters/bin/answer.py?answer=93713.
It just goes to show that this stuff still works, though.
-
All of the high-DA linking sites appear to be Chinese, Japanese, and other Asian sites, including government sites.
Take their 88 DA link. It is a blog comment that reads as follows:
"John1720 says:
Aloha!
http://nottowait.com ,vigrx, http://igrkio.info ,buy valium, http://www.ritmolatino.org ,buy adipex, http://www.robboranx.com ,buy vimax, http://propecia.dailyobjectivist.com ,propecia no prescription"
It's really frustrating how Google's algorithms can detect small details but completely miss something like this for years.
-
With site speed now showing in Webmaster Tools, I'd have figured it would get a ranking boost for "performance".
It does have a lot of linking root domains (2,665) and inbound followed links (13,799) from sites with up to 84 DA, which it probably owns.
Related Questions
-
High rank and traffic despite low DA and few backlinks
Hi guys, it's a pleasure being part of this community; I hope to learn a lot with you all. I started learning about SEO a year ago and it has been quite a journey. I was looking at competitors of some websites I have been optimizing, and I found a website that caught my attention; I can't figure out what's going on with it. It has huge traffic, but it is really weak in terms of technical SEO, and also in terms of DA and backlinks (around 20 backlinks, most of them spammy). The domain in question is bhnews.com.br. I noticed that it doesn't have any social media, no analytics, etc. The only thing I noticed is that there is a company called "BH News" (television), but it is not related, since the information bhnews.com.br presents is lottery results. This kind of situation confuses me a lot, because ranking a website on Google takes a lot of hard optimization work, and then I come across this type of website with 20 backlinks (mostly anchors of the domain name) that gets something like 2M visits per month and ranks for keywords related to lottery sites. Can someone tell me if there is some kind of black hat SEO, or something else, that is making this rank so high? Regards
White Hat / Black Hat SEO | jogobicho
-
My site is on page 2
My site is on page 2. How can I rank for this keyword in Dubai: "legal translation in Dubai"?
White Hat / Black Hat SEO | saharali15
-
Recovering from Google Penguin/algorithm penalty?
Does anyone think recovery is possible? My site has been in Google limbo for the past 8 months to a year or so. Like a lot of sites, we had SEO work done a while ago and ended up with tons of links that Google now looks down on. I have been working with an SEO company for a few months now and they seem to agree Penguin is the likely culprit; we are on pages 8-10 for keywords we used to be on page 1 for. Our site is informative and has everything intact. We deleted whatever links we could; contact information is hard to find for some sites, and some sites want money, so I paid a few of them a couple of bucks in the hope it might help the process. Anyway, we now have around 600 domains in the disavow file we put up in March-April, with another 100 or 200 added recently. If need be, a new site could be an option as well, but we will wait and see if the site can improve on Google with a refresh. Does anyone think recovery is possible in a situation like this? Thanks
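For context, the disavow file mentioned above is just a UTF-8 text file uploaded through Google's disavow links tool, with one entry per line: a "domain:" prefix disavows an entire domain, a full URL disavows a single page, and "#" starts a comment. A minimal sketch of the format (the domains below are placeholders, not entries from this question):

    # spammy directories found during the link audit (placeholder domains)
    domain:spammy-directory.example.com
    domain:link-farm.example.net
    # disavow one page without disavowing its whole domain
    http://blog.example.org/some-spammy-comment-page.html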
White Hat / Black Hat SEO | xelaetaks
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So there is no solution to all three of my problems. Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will add indexing latency, but slow server response times do in fact have a negative impact on rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
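To make the idea concrete, here is a minimal sketch of such a throttle as Python WSGI middleware; the bot list, load threshold, and shedding curve are illustrative assumptions, not the asker's actual implementation:

    import os
    import random

    # Bots we are willing to throttle; regular user traffic is never touched.
    THROTTLED_BOTS = ("bingbot", "ahrefsbot", "googlebot")

    # 1-minute load average above which we start shedding bot requests.
    # Illustrative threshold -- tune for the actual server.
    LOAD_THRESHOLD = 4.0

    class BotThrottleMiddleware:
        """Answer a load-dependent fraction of bot requests with a 503."""

        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            user_agent = environ.get("HTTP_USER_AGENT", "").lower()
            if any(bot in user_agent for bot in THROTTLED_BOTS):
                load = os.getloadavg()[0]  # 1-minute load average (Unix only)
                if load > LOAD_THRESHOLD:
                    # Shed a fraction of bot requests that grows with load:
                    # at twice the threshold, every bot request gets a 503.
                    shed = min(1.0, (load - LOAD_THRESHOLD) / LOAD_THRESHOLD)
                    if random.random() < shed:
                        start_response(
                            "503 Service Unavailable",
                            [("Content-Type", "text/plain"),
                             ("Retry-After", "120")],  # ask the bot to back off
                        )
                        return [b"Server busy, please retry later.\n"]
            # Normal handling for users, and for bots under the threshold.
            return self.app(environ, start_response)

One design note: sending a Retry-After header with the 503 tells well-behaved crawlers when to come back, and search engines generally treat an occasional 503 as a temporary condition. Serving 503s persistently over long periods can cause pages to drop out of the index, so this should act as a pressure valve rather than a permanent setting.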
White Hat / Black Hat SEO | internetwerkNU
-
301 redirect a set of pages to one landing page/URL?
I'm planning to redirect the following pages to one new URL/landing page:
Old URLs:
http://www.olddomain.com/folder/page/1
http://www.olddomain.com/folder/page/2
http://www.olddomain.com/folder/page/3
http://www.olddomain.com/folder/page/4
http://www.olddomain.com/folder/page/5
http://www.olddomain.com/folder/page/6
New URL:
http://www.newdomain.com/new-folder/new-page
Code in .htaccess that I will be using:

    RedirectMatch 301 /folder/page/(.*) http://www.newdomain.com/new-folder/new-page

Let me know if this is correct. Thanks!
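For comparison, the same redirect expressed with mod_rewrite (a sketch, assuming the rule sits in the old domain's root .htaccess) would be:

    RewriteEngine On
    # send every old /folder/page/* URL to the single new landing page
    RewriteRule ^folder/page/ http://www.newdomain.com/new-folder/new-page [R=301,L]

Either form collapses all six old URLs onto the one new page with a permanent (301) redirect.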
White Hat / Black Hat SEO | esiow2013
-
Has anyone used this? www.linkdetox.com/
Has anyone used this? www.linkdetox.com/ Any opinions about it?
White Hat / Black Hat SEO | Llanero
-
How best to do location-specific pages for ecommerce post-Panda update?
Hi, We have an eCommerce site and currently we have a problem with duplicate content. We created location-specific landing pages for our product categories, which initially did very well until the recent Google Panda update caused a big drop in rankings and traffic. For example:
http://xxx.co.uk/rent/lawn-mower/London/100
http://xxx.co.uk/rent/lawn-mower/Manchester/100
Much of the content on these location pages is the same or very similar, apart from a different H1 tag, title tag, and in some cases slight variations in the on-page content, but given that these items can be hired from 200 locations, it would take years to create unique content for every location in each category... We did this originally in April because we can't compete nationally; we found it easier to compete locally, hence the creation of the location pages, and they did do well for us until now. My question is: since the last Google Panda update our traffic has dropped 40%, rankings have gone through the floor, and we are stuck with this mess. Should we get rid of (301) all of the location-specific pages for each category, or keep, say, 10 locations for the most popular cities and either noindex/nofollow the other locations or 301 them, or what would people recommend? The only examples I can see on the internet of sites with multiple locations use a store-finder type approach... but you can't rank for the individual product/category that way... If anyone has any advice or good examples of sites that employ a good location-specific URL method, please let me know. Thanks, Sarah
White Hat / Black Hat SEO | SarahCollins