The majority of link building posts I see focus on acquiring new links, but what about your existing links? Once you have a strong link profile, it needs to be monitored and consolidated. You don’t want your best links to drop off. This post looks at ‘link decay’, also known as ‘web decay’ and ‘link rot’.
What is link decay?
‘Web decay’ in information retrieval science refers to the loss of pages from the Internet. Back in 2006 Bill Slawski blogged about a patent that looked at link decay, where web pages were found to disappear from the web at a rate of 0.5% per week.
Worldwidewebsize calculates the total number of pages on the Internet at 19.64 billion, while SEOmoz says the average page has 13.32 external links on it. By my calculations that’s around 1.3 billion links being wiped out each week!
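To sanity-check that figure, here is the back-of-the-envelope arithmetic, a rough sketch using only the numbers quoted above (the 0.5% weekly page loss rate, Worldwidewebsize’s page count and SEOmoz’s average external link count):

```python
# Rough estimate of external links lost per week, using the figures quoted above.
total_pages = 19.64e9        # Worldwidewebsize estimate of pages on the Internet
weekly_loss_rate = 0.005     # 0.5% of pages disappear each week (per the patent Bill Slawski covered)
avg_external_links = 13.32   # SEOmoz average number of external links per page

pages_lost_per_week = total_pages * weekly_loss_rate           # roughly 98 million pages
links_lost_per_week = pages_lost_per_week * avg_external_links

print(f"{links_lost_per_week / 1e9:.2f} billion links lost per week")  # roughly 1.31 billion
```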
Here I want to split the terms: ‘web decay’/‘link decay’ is the loss of pages from the Internet, while ‘link equity decay’ means the deterioration of a link’s equity over time, or the link being lost altogether.
Link Equity, measuring the juice
“Link equity is the concept of the influence of links on a page’s ability to rank for particular search queries. Link equity takes into account things like relevance, authority and trust, link placement and accessibility, the positive value of relevant outbound links, and the like”
(Eric Enge 6 ways to sculpt your Site link equity)
Measuring link equity precisely is difficult, but a very rough calculation is to divide a page’s toolbar PageRank by the number of links (internal & outbound) on that page. SEOmoz’s Linkscape measures the raw mozRank passed in a similar way. I am hoping this will be tweaked and updated in future versions of Open Site Explorer. Once you download the CSV from Linkscape you end up with the raw figure. Adam Feldstein, VP Product at SEOmoz, explained how to convert this figure back to the cleaner version:
They are in fact the same values, but for some reason the raw number is showing in the .csv file vs. the ‘pretty’ value that you see in the online report. If you want to convert the raw value to the pretty value in the spreadsheet, you can use the following formula.
(m * ln(raw)) + b
Raw is the raw value, and m and b are constants. The current m and b constants for URL mozRank are 0.385005057627582 and 12.7260317160508, respectively. These constants change just a little bit from index to index, but not generally in any significant way.
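Applied directly, Adam’s formula looks like the sketch below; the m and b constants are the ones he quotes above and will drift slightly between index updates.

```python
import math

# Constants quoted by Adam Feldstein for URL mozRank at the time of writing;
# they change a little from index to index.
M = 0.385005057627582
B = 12.7260317160508

def pretty_mozrank(raw: float) -> float:
    """Convert the raw mozRank value from the Linkscape CSV export into the 'pretty' score shown online."""
    return (M * math.log(raw)) + B

# Arbitrary raw value, purely to show the conversion.
print(round(pretty_mozrank(2.0e-9), 2))
```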
You could also do your own calculation using Open Site Explorer: take the Page Authority (PA) figure and divide it by the number of links on the page. You need to count the total number of links on the page, both internal and external (try this tool).
MajesticSEO uses ‘ACRank’ to order its links. Though this is only a rough measure of the strength of the page where your link sits, you could still use the above approach to get the ACRank per link.
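As a very rough sketch of this ‘equity per link’ idea, the calculation is just a page-level metric (PA, toolbar PageRank or ACRank) divided by the total number of links on the page. The snippet below counts every anchor tag on a page, internal and external alike, using only the Python standard library; the PA figure is something you would paste in by hand from Open Site Explorer.

```python
import urllib.request
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts every <a href> tag on a page, internal and external alike."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def equity_per_link(url: str, page_metric: float) -> float:
    """Divide a page-level score (PA, toolbar PR, ACRank) by the page's total link count."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    parser = LinkCounter()
    parser.feed(html)
    return page_metric / max(parser.count, 1)

# Example: a page whose PA of 54 was looked up by hand in Open Site Explorer.
print(equity_per_link("http://example.com/", 54))
```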
It’s also important to note that some links will not pass any equity from the start; you can check whether they are working by running link tests (see my Q&A-inspired post by Rand Fishkin here). The problem is measuring old links: it’s simply not viable to test every link in your profile.
How does link equity decay?
Links become broken
As discussed above, pages are lost from the web every day. Webmasters continually tweak and change their websites, which often causes broken links; pages where your link sat can simply be deleted or moved, creating 404 pages.
I tried to find the percentage of broken links on the web, but couldn’t find the figure anywhere, so I asked Dr Pete for the percentage of 404s in Linkscape’s last crawl.
Out of 416,475,264,279 links crawled I asked how many were broken:
“9.155% of all URLs in our last index were 404’d specifically. 10.217% of URLs had an error code of some kind (4xx, 5xx, 7xx, etc.) ”
Algorithm Changes & External Factors
Google makes between 350 and 550 algorithm changes a year to improve search quality and reduce spam. The majority of these changes will be undetectable, but links are continually being devalued. Over time many types of links have been devalued, such as directory links, reciprocal links and footer links. Google also continually improves its ability to spot and devalue paid links.
If you are wondering why you have had a big rankings drop, the chances are it is not a penalty… you have just had links discounted or devalued.
Along with algorithm changes, Google is constantly finding link networks and sites created solely for link building. Links are continually being discounted on a site-by-site basis, both directly and indirectly. Would a link from a Panda-hit site now pass less link equity?
Site Changes
Link equity decay can occur when sites have makeovers and are updated; a site or page redesign may, for example, add more internal links to the page.
Along with pages being redesigned, link equity decay can happen through outbound links being added to a page. You see this with directory links: every time a new listing is added, link juice is taken away from your link. Is it worth paying XXX amount when your link is on a page with a hundred others? In Linkscape’s last web crawl the average page had 60.88 links on it, including internal links, which still take link equity away from your link.
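To see how quickly new listings dilute your share, here is a tiny worked example, assuming the simple ‘metric divided by link count’ model used earlier and a directory page that keeps adding listings:

```python
# How a fixed page-level metric gets split as a directory page keeps adding links.
page_metric = 5.0  # arbitrary page-level score, purely for illustration

for total_links in (10, 25, 60, 100):
    print(f"{total_links} links on the page -> {page_metric / total_links:.3f} per link")

# Going from 10 links on the page to 100 cuts the value your single link receives by 90%.
```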
Poor website architecture can also cause your link juice to decay. You see this with WordPress posts: once they get pushed off the home page (beyond the standard 10 posts), they are only crawlable via next/previous links.
Webmaster/Site Manipulation
Webmasters can easily manipulate your link to stop it passing any value; an example is LinkedIn profiles now sending external links through a redirect. There are many ways besides simply adding ‘nofollow’ to block your link’s value; the entire page may later be blocked by the webmaster (a sneaky trick, but it still happens). I’ve seen even more cunning link scams, like reciprocal link cloaking: you get an email asking for a link exchange and they tell you your link has been added. You click on the URL provided, which drops a cookie or logs your IP, and your website ‘appears’ in their sidebar (nice 🙂). As discussed above, make sure the link is passing equity from the offset!
Webmaster Greed

Links being added to a page that are off topic will lower the trust metrics of the page/domain. You will see this a lot if you are in the market for buying links: webmasters will add links to pharmacy, debt and sex sites if the price is right. A good example is the $5000 Wikipedia links; not only is your link on a page with 182 other sites, but the page links out to casino and drug sites. Do you really want your link to be on that page?
Link Buyers’ Stupidity
Many webmasters who buy or rent links will let their links expire through lack of organisation. As you scale your link building, you will need systems in place to manage these link placements.
Rand argued that aged links do not add value, but I tend to disagree. If your link is removed and the page gets cached without it, and then your link reappears the next time the page is cached, isn’t that an obvious signal? It may not have a direct impact, but you have left a footprint.
Ways to stop link equity loss
1) Monitor your best links closely, especially if you have paid for them. Here is a script I just had built for the job: it checks that your links are still live; just upload a spreadsheet with all the links you want to track. It’s also pretty handy if you are doing a few reciprocal niche link exchanges with webmasters you don’t entirely trust (there is also a rough do-it-yourself sketch after this list).
SEO Doctor’s Link Tracking Script
To find out how to run it, see my quick installation guide.
2) Use a Link Atrophy Tool – this one is good and powered by Linkscape, though you will have to wait for the update.
You can also discover your dropped links using MajesticSEO. I have just tried it out on BeatThatQuote’s dropped links. Let’s see which links Google removed from this site’s link profile > removed links.
3) Control your own links. Creating a powerful network of sites is expensive and time consuming. I have seen SEOs cower at the term ‘network’ and associate it with badly thought-out link networks run by link brokers, but it doesn’t have to be this way. There is nothing wrong with having a network of sites, just make sure you do it right. If you don’t have time, just pay Fantomaster 500,000 Euros and he’ll sort you out!
4) Keep linking to your good guest posts, especially if on authority domains. A link on a strong domain will work better if you build links to the page.
6) Monitor OBLs and bad neighbours, especially if you are ‘renting links’. Keep checking the pages from time to time and make sure the webmaster is keeping his link sales on topic. Remove your link when you start seeing pharmacy, escort, casino or debt consolidation links (a crude keyword check is sketched after this list). Try this Bad Neighborhood Checker:
7) Monitor the content quality of your link placement sites. Often sites change hands and new webmasters may start publishing duplicate content which will lower the trust metrics of the site and the link equity passed.
8) Monitor the rate of link sales. Before you buy a link on a page, check the rate at which new links are added to that page and remember that each one added will dilute your link equity.
9) Track changes to the mozRank passed (mR passed) on the bulk of your quality links. I haven’t tried this yet, but it may be possible with the SEOmoz API (a simple tracking sketch follows this list).
10) Keep on top of Google’s algorithm changes, especially the ones that may devalue your links. Also find out what industry leaders consider as link devaluation metrics.
11) As social metrics are now a ranking signal, I’m sure a link on a highly shared page will have more value. The key is to find links on pages that are likely to be continually shared, such as good resource pages; they will also be more likely to attract links over time.
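For point 1, if you would rather roll your own checker than install the script, the sketch below shows the basic idea: read a plain CSV of (page URL, your URL) pairs and flag any page that no longer contains your link. The links.csv name and layout are my own assumptions, not how the SEO Doctor script actually works.

```python
import csv
import urllib.request

def link_still_live(page_url: str, your_url: str) -> bool:
    """Return True if the page still responds and its HTML still contains your link's URL."""
    try:
        html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", errors="ignore")
    except Exception:
        return False  # 404s, timeouts and DNS failures all count as a lost link
    return your_url in html

# Assumed links.csv layout: page_url,your_url (one placement per row)
with open("links.csv", newline="") as f:
    for page_url, your_url in csv.reader(f):
        status = "OK" if link_still_live(page_url, your_url) else "MISSING"
        print(f"{status}\t{your_url} on {page_url}")
```

A fuller version would also check for nofollow attributes, meta robots blocks and redirects, which is where a dedicated tool earns its keep.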
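For point 6, a crude way to keep an eye on OBLs between manual checks is to scan the pages your links sit on for tell-tale anchor text. This is only a keyword flag with a hand-picked word list, not a real bad-neighbourhood analysis:

```python
import re
import urllib.request

# Hand-picked red-flag terms; tune this list to your own niche.
RED_FLAGS = ("pharmacy", "viagra", "casino", "escort", "debt consolidation", "payday")

def flag_bad_neighbours(page_url: str) -> list:
    """Return the red-flag terms found in the page's anchor tags (URLs and anchor text)."""
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", errors="ignore")
    anchors = " ".join(re.findall(r"<a\b[^>]*>.*?</a>", html, flags=re.I | re.S)).lower()
    return [term for term in RED_FLAGS if term in anchors]

# Hypothetical page you are renting a link on.
hits = flag_bad_neighbours("http://example.com/links-page/")
if hits:
    print("Consider pulling your link, found:", ", ".join(hits))
```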
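For point 9, whichever metric you end up tracking (mR passed if the API exposes it, PA, or ACRank), the monitoring side is just snapshotting the value regularly and flagging drops. A minimal sketch, assuming you collect the values yourself (by hand or via whichever API you use) and log them to a CSV; the file layout and 10% threshold are my own choices:

```python
import csv
from datetime import date

def snapshot(metrics, log_path="metric_log.csv", drop_threshold=0.10):
    """Append today's metric for each URL to a CSV log and warn about drops beyond the threshold.

    `metrics` is a {url: value} dict collected by hand or via an API of your choice.
    """
    previous = {}
    try:
        with open(log_path, newline="") as f:
            for _day, url, value in csv.reader(f):
                previous[url] = float(value)  # keep the most recently logged value per URL
    except FileNotFoundError:
        pass  # first run, nothing to compare against yet

    with open(log_path, "a", newline="") as f:
        writer = csv.writer(f)
        for url, value in metrics.items():
            writer.writerow([date.today().isoformat(), url, value])
            old = previous.get(url)
            if old and value < old * (1 - drop_threshold):
                print(f"WARNING: {url} dropped from {old} to {value}")

# Example run with a made-up value, purely to show the logging and comparison.
snapshot({"http://example.com/guest-post/": 4.2})
```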
Every professional link builder should think about link equity decay, especially if they are tracking link velocity closely against competitors. If you have any thoughts or ideas on this topic let me know in the comments below.
Update: Since I started writing this post, SEOmoz has totally turned off Linkscape and has not included any mozRank passed metrics in OSE. I’m hoping they have something up their sleeve that will be rolled out soon.
Link Rot Example
Link rot can be seen by comparing Majestic’s two databases: the Historic Index vs the Fresh Index (90 days). Take this post for example:
Facebook’s Graph Search = The Ultimate Online Dating Service?
Majestic Historical Index
Majestic Fresh Index
This equates to 1,943 lost linking domains over 3 years!
This is a solid blog post with some great resources, thanks for sharing.
Rarely do I find an article that gets my attention. Bravo mate! Great article. I believe this is the first legit piece on link decay. I’d love to connect sometime over some virtual coffee.
Cheers,
Joe
Thanks Joe, kind words!
It seems that paid links now have no value – even on the relevant directories where those links offer true value to the user experience. Where we pay we really have to justify that cost and monitor it constantly as the search engine traffic far outweighs that from “natural” links. It is very difficult to assess the impact of this devaluation.
This is just a dead-horse topic! The whole point of Google is to try and balance a market. You have to have links to rank. No matter how you want to look at it, wherever there is demand there will be competition. Naturally, SEO companies are going to offer solutions to rank. Google cannot control what supply and demand wants. They can try all day long to bust spam, but that very demand to rank and target customers is why Google is even important. If Google continues to devalue links into oblivion, they won’t be the center of anything. Imagine a world where there is too much oppression; why the hell would any webmaster waste their money or time on a platform that negates their ability to compete? Links are needed to manipulate Google’s index. Link decay is bound to happen for a number of reasons, however, there are many other factors that play into effect here. I have used Yahoo Site Explorer and studied many websites and the amount of links they have amassed to their website. Amazon has some 210,000,000 links reported by Yahoo’s Site Explorer. I studied some of their links and found them to be on sites that specifically existed to push link juice, so they could rank for those targeted keywords. Anyone see a paradox here? At the end of the day, I would not worry about negatives, they only exist to get rid of true spammers and shit webmasters with no real information or products for consumers.
Thanks for taking the time to comment Chad!
Hey there Gareth, evidently SEOmoz liked your post well enough to link to it in the newsletter I just received 🙂 Congrats! Great post btw, plenty of food for thought re paying more attention to current back-links.
bit.ly/qOZOOD item 8
haha yes I just saw that in my email newsletter this morning…it was worth the time after all!
Interesting information on link decay. Thanks
Yip – some fantastic resources I hadn’t seen before. Thanks Gareth. And one fabulously appropriate typo: “Rank Fishkin”. Nice.
lol good catch… I was testing to make sure you were paying attention 🙂
An interesting read – great to see someone raising the subject of the importance of looking after links! Agencies and in-house teams alike have a responsibility to ensure that the time, effort and investment dedicated to link building is preserved in the ways you’ve described. NB: Buzzstream is a great tool for link management & can be scheduled to run checks on back-links.
Thanks Amy, I have not played with Buzzstream before, it’s now on the to-do list. Also got a nice tip from @Wiep: http://www.changedetection.com/ > it monitors a page for any changes and sends you an email alert if the page’s text changes at all.
Hi,
thank you for this information. I am tracking links with Raven Tools for the major projects. For analysis, if I want to go for a link on a specific page, I use the SEOmoz tools, like the OSE you mentioned here.
I find them useful, but they are missing some “link value” decrease or increase status reporting. I will now figure out how I can do it manually using your tips.
Greetings from Hamburg, Germany.
Thanks Braumueller for dropping by! I asked Rand if the Mozrank passed metric will be added to OSE and the answer is NO.
“… I think raw mozRank or other PageRank-like link metrics are getting less and less valuable/important for SEO, as the engines get much more sophisticated in the signals they use for rankings and crawling. That said, it’s still good stuff to consider, particularly for large sites that have indexation problems. Re: mozRank passed – we haven’t found it to be a very valuable metric (and very few of our members used it), but we might be adding it if we get lots of requests (there’s so many things folks want and our list of to-dos is a mile long) 🙂 “
Correction: your link to Wikipedia’s corporate sustaining donors does not appear to have anywhere near 182 links, much less links to casino or pharmaceutical sites.
182 including internal links, that is, and yeah, not loads of links to casinos etc. but there are some. Would you pay for a link on that page?
I’ve tried using this link checking tool, but I can’t get past the setup page. When I enter a login and password on that setup page, it just redirects me back to the same page. It won’t take me to the login page. Did I do something wrong?
It should kick into a dashboard once you log in… try reinstalling!
Nice post Gareth, found this article via the SEOmoz email newsletter. I always wonder about losing my most powerful links.