Question #1 - My Cherry's Popped!
-
I recently acquired the rights to a URL that is one of our keywords. Instead of developing a landing page at that URL and then only linking it back to the company root, I was thinking about adding a link within the company's global nav that points to this new URL (and new page content, of course).
Are there any pros or cons to doing it that way?
Thank you so much!
-
You are very welcome!
-
That second half is very interesting - regarding pre-established inbound links. I will certainly look into that. Thank you for the replies, Dana - great help indeed.
-
I do think that it has disappeared from browser toolbars. I always check it manually at http://www.prchecker.info/check_page_rank.php. Given that there isn't much authority established on this new URL, I would link to it from your existing domain, as long as the new domain doesn't have any insidious (nefarious?) inbound links from its past. I am interested to know what others think.
Dana
-
Is it just me, or has PR been removed from all browser toolbars? So, I don't know the answer.
There are some links on the current URL, but they are just sponsored ads.
The URL has been established since 2007.
-
Does this new URL have any domain authority, backlinks or PageRank?
Related Questions
-
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged up in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article.

My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google also only reads the first article, which seems like an ideal solution. This has the added benefit of speeding up page load time too.

My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat has 'sort of' followed Google's guidelines on how to implement infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?

Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/

Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
White Hat / Black Hat SEO | | Daniel_Morgan1 -
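To make the pattern in that question concrete, here is a rough TypeScript sketch of the search-friendly load-on-scroll approach it describes: each article keeps its own crawlable URL, the next article is only fetched when the reader reaches the end of the current one, and the address bar is updated to match. The article list, endpoint behaviour, and element ids are hypothetical placeholders, not VentureBeat's actual code.

```typescript
// Sketch of search-friendly infinite scroll for an article "feed".
// Assumption: each URL below is also served normally as a standalone,
// indexable page; this script only augments the reading experience.

const articleUrls = [
  "/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/",
  "/2016/11/11/next-related-article/", // hypothetical follow-up article
];

let nextIndex = 0;
const container = document.getElementById("article-stream")!;
const sentinel = document.getElementById("load-more-sentinel")!;

const observer = new IntersectionObserver(async (entries) => {
  // Only act when the sentinel at the bottom of the current article is visible.
  if (!entries[0].isIntersecting || nextIndex >= articleUrls.length) return;

  const url = articleUrls[nextIndex++];

  // Fetch the next article's HTML; the same content remains reachable
  // (and cacheable by crawlers) at its own URL.
  const html = await fetch(url).then((response) => response.text());

  const section = document.createElement("section");
  section.innerHTML = html;
  container.appendChild(section);

  // Keep the visible article and the URL in sync without piling up
  // a history entry for every scroll step.
  history.replaceState({}, "", url);
});

observer.observe(sentinel);
```

The design choice this sketch illustrates is the one the question hinges on: the first article is the only content crawlers see on the page, while every further article lives at its own canonical URL. The rel="prev"/rel="next" markup from Google's guidance was designed for a paginated series, which is why applying it to a feed of loosely related articles feels like a stretch.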
I have plenty of backlinks, but the site does not seem to come up on Google's first page.
My site has been jumping up and down for many months now, but it never stays on Google's first page. I have plenty of backlinks and have shared content on social media. What could I be doing wrong? Any help will be appreciated. The content is legit. I have recently added some internal links; might this be the cause? Please help.
White Hat / Black Hat SEO | | samafaq0 -
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly Bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance.

The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized higher than bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS has a way to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. No solution to all three of my problems.

Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. Which portion of traffic, for which bots, can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | | internetwerkNU1 -
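For illustration, here is a minimal sketch of the kind of load-shedding the question describes, written as Node/Express middleware in TypeScript. The bot pattern, load threshold, shedding formula, and Retry-After value are made-up placeholders; the poster's own CMS would implement the equivalent rule in its own stack and with its own notion of "server load".

```typescript
import express from "express";
import os from "os";

const app = express();

// Illustrative values only: which user agents count as bots, and how busy
// the box may get before we start shedding bot requests.
const BOT_PATTERN = /bingbot|ahrefsbot|googlebot|semrushbot/i;
const LOAD_THRESHOLD = os.cpus().length * 0.75;

app.use((req, res, next) => {
  const userAgent = req.get("user-agent") ?? "";
  const oneMinuteLoad = os.loadavg()[0];

  // Human visitors are never shed; only recognised bots are throttled.
  if (BOT_PATTERN.test(userAgent) && oneMinuteLoad > LOAD_THRESHOLD) {
    // Shed a larger share of bot requests the further load is over the threshold.
    const overload = Math.min(1, (oneMinuteLoad - LOAD_THRESHOLD) / LOAD_THRESHOLD);
    if (Math.random() < overload) {
      res.set("Retry-After", "120"); // hint to the crawler to come back later
      return res.status(503).send("Temporarily unavailable, please retry later.");
    }
  }
  next();
});

app.get("/", (_req, res) => res.send("normal content"));
app.listen(3000);
```

Shedding proportionally to the overload, rather than all-or-nothing, keeps response times for human visitors stable while still letting some bot requests through, which matches the "portion of the bot traffic" idea in the question.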
Creating pages as exact match URLs - good or over-optimization indicator?
We all know that exact match domains are not getting the same results in the SERPs with the algo changes Google has been pushing through. Does anyone have any experience, or know whether that also applies to having an exact match URL page (not domain)?

Example keyword: "cars that start with A". Which way is better when creating your pages on a non-exact-match domain:

www.sample.com/cars-that-start-with-a/ which has "cars that start with A" as the on-page heading, or
www.sample.com/starts-with-a/ which again has "cars that start with A" as the on-page heading?

Keep in mind that you'll add more pages that start the exact same way, as you want to cover all the letters in the alphabet. So:

www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/

or

www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/

Hope someone here at the Moz community can help out. Thanks so much!
White Hat / Black Hat SEO | | lidush0 -
Google penalty from having bad sites while working on 1 good site?
I have a list of websites that are not spam. They are OK sites; it's just that I need to work on the content again, as the sites' content might not be 100% useful for users. They are not bad sites with spammy content; I just want to rewrite some of the content to really make great websites. The goal would be to have great content that earns natural links and a great user experience.

I have 40 sites, all travel sites related to different destinations around the world. I also have other sites that I haven't worked on for some time. Here are some of the sites:

www.simplyparis.org
www.simplymadrid.org
www.simplyrome.org
etc.

Again, they are not spam sites, just not as useful as they could become. I want to work on a few sites only, to see how it goes. Will this penalise the sites that I am working on if I have other sites whose content is average or not as good? I want to make great content that's good for link bait 🙂
White Hat / Black Hat SEO | | sandyallain0 -
Ethical question about setting up a local business directory...
I've just launched a local business directory that allows the small businesses in my local area to get noticed by a very targeted audience - people such as self-employed builders, painters and decorators, and the like. As well as helping them out, they'll be helping me out by testing the waters of local keywords and showing just how difficult they are and how much traffic they'll get. It's kind of like spreading myself across hundreds of keywords without the need for separate domains, etc.

All the links on there are nofollow, but what are the ethics of letting my clients have a dofollow link from it? I wouldn't use this directory as a marketing tool for myself and my business, but if I acquired a client from my website or another source, they would benefit from a dofollow link.
White Hat / Black Hat SEO | | jasonwdexter0 -
Site ranking at position 1 after existing for 1 day
Hi, I work in online gaming. For a few months there was a website called 'Htmlwijzer.nl' that ranked on page 1 for 'online casino' in Google.nl, which is of course a highly competitive keyword, and it remained there for about 2 months. This website didn't come up slowly in Google.nl from page 10 to page 1; it was just there one day at page 1. It only had HTML-related informational content, and it ranked for 'online casino' with a layer ad on its site that was only shown to users in the Netherlands.

Now that website has received a penalty after 2 months, and since today a new site is in Google at position 1, called www.casinowijzer.nl. It has no backlinks at all (only a 301 from the former domain). It just popped up there. Does anyone know how this website could have gotten there? It's obviously black hat, but for a keyword like 'online casino' it's quite amazing.

Thanks,
White Hat / Black Hat SEO | | iwebdevnl0 -
Can our white hat links get a bad rap when they're alongside junk links busted by Panda?
My firm has been creating content for a client for years - video, blog posts, and other references. This client's web vendor has been using bad links and link farms to bolster rank for key phrases - successfully, until last week, when Google slapped them. They have been officially warned in WMT for possibly using artificial or unnatural links to build PageRank. They went from page one for the most popular term in Chicago for their industry, where they had been for over a year, to page 8 - overnight. Other, less generic terms that we were working on felt the sting as well.

I was aware of, and had warned the client about, the possibility of repercussions from these black hat tactics (http://www.seomoz.org/blog/how-google-makes-liars-out-of-the-good-guys-in-seo#jtc170969), but didn't go as far as to recommend they abandon them. Now I'm wondering if one of our legitimate sites (YoChicago.com), which has more than its share of the links into the client site, is being considered a source of bad links. All of our links are legitimate, i.e., the anchor text matches the destination, and video links describe the entity that is linked to. Are we vulnerable? Any insight would be appreciated.
White Hat / Black Hat SEO | | mikescotty0