The author's views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.
SEOs have powerful metrics at their disposal to measure the success of their strategies, such as Domain Authority (DA) and Page Authority (PA). But how best to use them? In today's Whiteboard Friday, Tom shows you how to think about these metrics as part of a holistic approach to your link building analysis.
Happy Friday, Moz fans, and today's Whiteboard Friday is about measuring link building. So obviously this is a very big and very old topic in the SEO space, and it's one that Moz, as a company, is heavily invested in, right? Like Domain Authority and Page Authority are two very popular products of ours, which are commonly used for this exact purpose.
Now this isn't going to be advertorial, though. I could stand here and just say obviously these are the best metrics in the world and that kind of thing. That's not what I'm here to do. I'm here to give you a bit of nuance about how and when to use these metrics and how to think about them, and how to use them alongside other metrics as well, rather than just having one tool and saying it's a solution to all problems, which isn't necessarily fair.
So to do that, I'm actually going to start by going right back to 1998 and Google's PageRank model. Now I know that a lot has changed since 1998, both with the world and with Google. But this was Google's original way of thinking about links, and in a lot of ways it's still the best that we have to go on. A lot of current SEO best practices and dogma are still based on this original understanding, except there are a few things we've sort of picked up along the way that don't really have a basis in anything that Google has said or done, which is part of why I want to sort of point them out.
So PageRank originally was a way of using links to estimate the probability that a user is on a page, and that's already quite interesting, because that shows that this is a model that is about popularity. So when we talk about this now, we often talk about things like trust and authority and this kind of thing. I'm sure those are relevant, but it's worth remembering that originally this was just a way of estimating effectively the popularity of a page.
Note that I said of the page as well, not even the domain. So imagine a world where there's one page on the internet, which is Page A that I've labeled here. Now if there's one page on the internet, it's not that hard to estimate the chance that a random browser is on that page. It's a certainty they're on that page. If we introduce a second page, it's still not that hard, and we just assume it's going to be 50-50 and so on and so forth.
That's the baseline probability that we have to work with. But then things get more interesting when one page links to another, and that's obviously what we're actually interested in. So say A links to this second page. Ignore these other boxes, they'll come in later; at the moment there are still only two pages on the internet, and A links to the second one.
We say that 0.85 times this probability is passed on. Now 0.85 is a fairly arbitrary sort of constant. It's one that comes from an old Google document. It probably isn't that exact value, but it's fine for illustrative purposes, and it's the best we've got to go on.
So, in this case, why have we said 0.85 by the way? Why haven't we said that all of the users on this page click through? Well, that's because we assume that some of them are going to go and do their own thing, stop browsing the internet, do something else. It turns out that this damping factor is quite important in a world where pages do actually link to each other in a big web rather than just one link in one direction.
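The random-surfer idea described so far can be sketched in a few lines of code. This is only an illustration of the simplified model from the talk, not Google's actual algorithm; the 0.85 damping factor comes from the old Google document mentioned above, while the function name, iteration count, and page labels are just made up for the demo.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Estimate the probability that a random browser is on each page.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal probability

    for _ in range(iterations):
        # The (1 - damping) share is the user who stops browsing and
        # goes and does their own thing, restarting somewhere at random.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # 0.85 of this page's probability is passed on,
            # split evenly across its outgoing links.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Two pages that link to each other settle at 0.5 each,
# matching the 50-50 intuition above.
ranks = pagerank({"A": ["B"], "B": ["A"]})
```

The damping factor is what keeps this stable when pages link to each other in a big web: without it, probability would just circulate forever and the iteration wouldn't settle.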
So that's all well and good, right? What if we had a second link and introduced a third page to the internet? So this is still a very simplistic model. We've got an internet with three pages and two links, and the links only go in one direction.
This is very, very simple. But in this case we say we can't have both of these pages getting the full probability. The users aren't clicking through to both; they're clicking through to one or the other. So each of these two pages gets half of 0.85 times A's probability.
Again, in a more complex model, we might say, oh, one of these links is more likely to be clicked on, so it gets more probability or something like that. But in this simple version, we're saying it's split two ways. Now, in this case, we've already learned something interesting again, because by adding another link we've reduced the value of the existing links and that's something that we hardly ever think about in a link building context.
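That dilution effect is easy to see with the numbers from the talk. A quick sketch under the same simplified model (the 0.85 damping factor is from above; the variable names are just for illustration):

```python
# If page A's probability is 1.0, the simplified model passes on
# 0.85 of it, split evenly across A's outgoing links.
damping = 0.85
page_probability = 1.0

value_with_one_link = damping * page_probability / 1   # the single target gets 0.85
value_with_two_links = damping * page_probability / 2  # each target now gets 0.425

# Adding a second link halved what the first link passes on.
```

This is the same arithmetic behind trimming links from the top nav: every extra link dilutes the value flowing through the ones you care about.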
But that is sort of what we're thinking about when, in technical SEO conversations, we talk about not having too many links in the top nav and this kind of thing. We're trying to focus our strength where we most want it. Then, lastly, and I promise the math will stop soon, what if we had another jump in this system? Well, in this case, this 0.85, this damping, happens again.
So 0.85 times 0.85 is about 0.72, so even less is passed on. Basically it's 0.85 times the probability of the page above it, so the value has gotten even lower. This is why, as technical SEOs, we sometimes get caught up with things like chained redirects and this kind of thing, and why we think that's important.
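The compounding effect of each extra hop can be written out directly. Again, this assumes the simplified model and its 0.85 damping factor, nothing more:

```python
damping = 0.85

# Value surviving after each hop in a chain:
after_one_hop = damping          # 0.85
after_two_hops = damping ** 2    # 0.7225
after_three_hops = damping ** 3  # about 0.614
```

In this model, every extra hop in a redirect chain multiplies the passed-on value by 0.85 again, which is the intuition behind avoiding chained redirects.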
That's where that sort of dogma comes from. So I'm not going to go any further with this sort of simplified PageRank explanation. What I am trying to draw to your attention here is a few things. One is that there's a lot about the specifics of a page here that affects the value of these links, like the number of links that the page sent outwards and also things like what linked to the specific page.
Note that I didn't say anything about domains here. This could be on four different domains. It could be on one domain. We only talked about page specifics here. Google has been a little bit ambiguous over time in terms of how they think about pages versus domains. But broadly speaking, they say they care about pages, not domains. So that's interesting, right, because these could all be on the same domain conceivably and yet this page could potentially be a lot weaker and pass on a lot less strength than this one.
So that's interesting, and that's something we don't normally think about with link building. So if we bring this back on topic to what I said I was going to talk about, actual metrics for link building, there are a few qualities that we're looking for.
Now, these first two are qualities I haven't talked about yet. We do want metrics that are fast. We want them to be available as quickly as possible so we can report to our client or our boss or that kind of thing, and also we're busy people. We don't want to waste our time.
We want metrics that are ubiquitous, so when I do say to my boss, "Oh, I've got you a link which had DA 90," there's a good chance that he or she or they know what that means. Whereas if I say it had a Tom Capper score of 38B, they're going to say, "What are you talking about?" So I do need to use a metric that's reasonably well understood.
But then there's the page and link level specificity that I just talked about. So if I think about a metric like Domain Authority, it does very well on those first two qualities, and it does okay on this third one, because it is trained on rankings to some degree, and rankings reflect some of these page and link level factors.
So there's some benefit there. It does take into account some of this stuff, but ultimately it's a domain level metric. So it has to treat all the pages on one domain equally by definition. That produces some pros and cons.
So what I want to do is I want to put some metrics on a chart like this and suggest how you might use them alongside each other.
So I've got "actual" as the vertical axis here. The closer a metric is to what we're actually trying to measure, which is basically Google's view of the value of the link, the further up it goes. Then the horizontal axis is this fast/slow convenience scale. So a metric like Domain Authority is probably somewhere here. It's very fast.
It's very ubiquitous. But it's missing some of this nuance because it's a domain level metric, and it's answering a slightly different question. DA is designed to answer the question, "How likely is a page on this domain, all other things being equal, to rank well?" That's a slightly different question from how valuable a given link is. But if I'm saying I want something like DA but not at the domain level, you might say, "Oh, well, Moz has a metric for that, and it's called Page Authority."
Well, yeah, that is a good candidate. So like most page level metrics in the industry, including Google's and including our own, Page Authority is initially informed by some domain level factors as well as page level factors. We've done correlation studies and this kind of thing.
It is a lot closer to measuring the value and ranking potential of a specific page than the Domain Authority is, as you would expect, because it's a more precise metric and it is capturing some of this nuance. But actually you can go a step further with this as well. Now Page Authority is a bit slower than Domain Authority because you have to wait for Moz to discover and crawl the page.
We do our best, but it's not instant. However, if you're willing to wait even longer than that, you could use a metric like referral traffic. Apologies for my absolutely awful writing there.
So with referral traffic, what we're interested in is how many people actually click through from the link that I built to my site. That's interesting because that's what Google was actually trying to measure in the first place. So if we can measure that, then we're getting pretty close to whatever they were aiming for.
So whatever sophistication they've built in, we're sort of capturing that nuance. Now that has some obvious drawbacks. One is that a lot of link building campaigns don't do very well on this metric, and you can draw your own conclusions about that. The other is that you're obviously going to have to wait quite some time for this data to become available, and even then there might be issues with the client's analytics or this kind of thing. Anyway, that's what I wanted to share with you today.
Essentially what I would suggest is that you use all of these metrics and some others that you could put yourself on this chart. So I'm interested to hear what metrics you would use and where you would draw them on this kind of a chart. I put these green lines in as sort of a guide because I think you could do prospecting in this first section, like before you've even built the link, and then initial reporting to the client.
Then this section would more be after the campaign, when you want to learn from it and think about what kind of links you would build in the future and whether you would do the same sort of thing again. But yeah, I'd love to hear your ideas. Thank you very much and Happy Friday.