The majority of link building posts I see focus on acquiring new links, but what about your existing links? Once you have a strong link profile, it needs to be monitored and consolidated. You don't want your best links to drop off. This post looks at 'link decay', also known as 'web decay' and 'link rot'.
What is link decay?
‘Web decay’ in information retrieval science refers to the loss of pages from the Internet. Back in 2006 Bill Slawski blogged about a patent that looked at link decay, where web pages were found to disappear from the web at a rate of 0.5% per week.
Worldwidewebsize calculates the total number of pages on the Internet at 19.64 billion, while SEOmoz says the average page has 13.32 external links on it. By my calculations, that's 1.3 billion links being wiped out each week!
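That back-of-the-envelope figure can be checked in a couple of lines, plugging in the estimates quoted above:

```python
# Sanity-check the "1.3 billion links lost per week" figure using the
# estimates quoted above (Worldwidewebsize page count, SEOmoz average
# external links per page, 0.5% weekly page-loss rate from the patent).

total_pages = 19.64e9       # estimated pages on the Internet
links_per_page = 13.32      # average external links per page (SEOmoz)
weekly_decay_rate = 0.005   # 0.5% of pages disappear each week

pages_lost_per_week = total_pages * weekly_decay_rate
links_lost_per_week = pages_lost_per_week * links_per_page

print(f"Pages lost per week: {pages_lost_per_week:,.0f}")
print(f"Links lost per week: {links_lost_per_week:,.0f}")  # ~1.31 billion
```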
Here I want to split the terms. Web/link decay being the loss of pages from the Internet, but ‘link equity decay’ to mean the deterioration of link equity over time or the link being totally lost.
Link Equity, measuring the juice
“Link equity is the concept of the influence of links on a page's ability to rank for particular search queries. Link equity takes into account things like relevance, authority and trust, link placement and accessibility, the positive value of relevant outbound links, and the like.”
(Eric Enge, 6 ways to sculpt your site link equity)
Measuring link equity precisely is difficult, but a very rough calculation is to divide a page's toolbar PageRank by the number of links (internal and outbound) on that page. SEOmoz's Linkscape measures the raw mozRank passed in a similar way. I am hoping this will be tweaked and updated in future versions of Open Site Explorer. Once you download the CSV from Linkscape you end up with the raw figure. Adam Feldstein, VP Product at SEOmoz, explained how to convert this figure back to the cleaner version:
They are in fact the same values, but for some reason the raw number is showing in the .csv file vs. the ‘pretty’ value that you see in the online report. If you want to convert the raw value to the pretty value in the spreadsheet, you can use the following formula.
(m * ln(raw)) + b
Raw is the raw value, and m and b are constants. The current m and b constants for URL mozRank are 0.385005057627582 and 12.7260317160508, respectively. These constants change just a little bit from index to index, but not generally in any significant way.
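As a quick sketch, that conversion looks like this in code, using the formula and the m and b constants quoted above (which, as noted, drift slightly between indexes):

```python
import math

# Convert a raw Linkscape mozRank value (as exported in the CSV) into the
# "pretty" value shown in the online report: (m * ln(raw)) + b.
# Constants are those quoted for URL mozRank in the current index.

M = 0.385005057627582  # m constant for URL mozRank
B = 12.7260317160508   # b constant for URL mozRank

def pretty_mozrank(raw: float, m: float = M, b: float = B) -> float:
    """Apply (m * ln(raw)) + b to a raw mozRank value."""
    return m * math.log(raw) + b
```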
You could also do your own calculations using Open Site Explorer. Take the Page Authority (PA) figure and divide it by the number of links on the page. You need to calculate the total number of links on the page, both internal and external (try this tool).
MajesticSEO uses 'AC Rank' to order its links. Though this is only a rough measure of the strength of the page your link sits on, you could still use the same approach to get the AC Rank per link.
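Both approaches boil down to the same crude division, whichever page-level metric you plug in. A tiny helper, with illustrative numbers rather than real metrics:

```python
def equity_per_link(page_metric: float, total_links: int) -> float:
    """Crude per-link estimate: divide a page-level score (toolbar PageRank,
    Page Authority, AC Rank...) evenly across every link on the page."""
    if total_links <= 0:
        raise ValueError("page must have at least one link")
    return page_metric / total_links

# e.g. a PA 60 page carrying 45 internal + 15 external links:
print(equity_per_link(60, 45 + 15))  # 1.0
```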
It's also important to note that some links will pass no equity from the start; you can see whether they are working by running link tests. See my Q&A-inspired post by Rand Fishkin here. The problem is measuring old links: it's simply not viable to test every link in your profile.
How does link equity decay?
Links become broken
As discussed above, pages are lost from the web every day. Webmasters continually tweak and change their websites, which often causes broken links: pages where your link sat can simply be deleted or moved, creating 404s.
I tried to find the percentage of broken links on the web, but couldn't find the figure anywhere, so I asked Dr Pete for the percentage of 404s in Linkscape's last crawl.
Out of 416,475,264,279 links crawled, I asked how many were broken:
“9.155% of all URLs in our last index were 404’d specifically. 10.217% of URLs had an error code of some kind (4xx, 5xx, 7xx, etc.) ”
Algorithm Changes & External Factors
Google makes between 350 and 550 algorithm changes a year to improve search quality and reduce spam. The majority of the changes will be undetectable, but links are continually being devalued. Over time many types of links have been devalued, such as directory links, reciprocal links and footer links. Google also continually improves its ability to spot and devalue paid links.
If you are wondering why you have had a big rankings drop, the chances are it's not a penalty: you have just had links discounted or devalued.
Along with algorithm changes, Google is constantly finding link networks and sites created solely for link building. Links are continually being discounted on a site-by-site basis, both directly and indirectly; would a link from a Panda-hit site now pass less link equity?
Link equity decay can also occur when sites have makeovers and are updated; a site or page redesign may, for example, add more internal links to the page your link sits on.
Along with pages being redesigned, link equity decay can happen through outbound links being added to a page. You see this with directory links: every time a new listing is added, link juice is taken away from your link. Is it worth paying XXX amount when your link sits on a page with a hundred others? From Linkscape's last web crawl, the average page has 60.88 links on it, including internal links, which still take link equity away from yours.
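To see how that dilution plays out, here is the same crude per-link model applied as a page keeps gaining listings (the numbers are illustrative, not real metrics):

```python
# Illustrative directory-style dilution: the same page-level score spread
# across an ever-growing number of links. Each new listing shrinks the
# share every existing link receives.

page_metric = 50.0  # hypothetical page-level score

for total_links in (10, 25, 60, 100, 200):
    share = page_metric / total_links
    print(f"{total_links:>4} links on page -> {share:.2f} equity per link")
```

By the time the page carries the crawl-average 60-odd links, each link is getting well under a fiftieth of what a 10-link page would pass.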
Poor website architecture can also cause your link juice to decay. You see this with WordPress posts: once they are pushed off the home page past the standard 10 posts, they are only crawlable via next/previous links.
Webmasters can easily manipulate your link to stop it passing any value; one example is LinkedIn profiles now sending external links through a redirect. Besides simply adding 'nofollow', there are many ways to block your link's value: the entire page may later be blocked by the webmaster, a sneaky trick that still happens. I've seen even more cunning link scams, like reciprocal link cloaking: you get an email asking for a link exchange and are told your link has been added; you click the URL provided, which drops a cookie or logs your IP, and your website 'appears' in their sidebar (nice 🙂 ). As discussed above, make sure the link is passing equity from the offset!
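Two of these tricks can at least be spotted in a page's HTML. A minimal sketch, assuming nofollowed links and redirect-wrapped links are what you are hunting for; the redirect URL patterns here are my own illustrative guesses, not a definitive list:

```python
from html.parser import HTMLParser

# Flag anchors pointing at your domain that are nofollowed or routed
# through a redirect endpoint (as LinkedIn profiles now do). The
# REDIRECT_HINTS substrings are illustrative assumptions.
REDIRECT_HINTS = ("/redir", "redirect?", "out.php?url=", "leaving?")

class LinkAudit(HTMLParser):
    def __init__(self, my_domain: str):
        super().__init__()
        self.my_domain = my_domain
        self.findings = []  # (href, verdict) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.my_domain not in href:
            return
        rel = attrs.get("rel") or ""
        if "nofollow" in rel:
            self.findings.append((href, "nofollow"))
        elif any(hint in href for hint in REDIRECT_HINTS):
            self.findings.append((href, "redirect"))
        else:
            self.findings.append((href, "ok"))

audit = LinkAudit("example.com")
audit.feed('<a rel="nofollow" href="http://example.com/">me</a>')
print(audit.findings)  # [('http://example.com/', 'nofollow')]
```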
Links added to a page that are off topic will lower the trust metrics of the page/domain. You will see this a lot if you are in the market for buying links: webmasters will add links to pharmacy, debt and sex sites if the price is right. A good example is the $5000 Wikipedia links; not only is your link on a page with 182 other sites, but the page links out to casino and drug sites. Do you really want your link on that page?
Link Buyers' Stupidity
Many webmasters who buy or rent links will let their links expire through lack of organisation. As you scale your link building you will need systems in place to manage these link placements.
Rand argued that aged links do not add value, but I tend to disagree. If your link is removed and then reappears the next time the page is cached, isn't that an obvious signal? It may not have a direct impact, but you have left a footprint.
Ways to stop link equity loss
1) Monitor your best links closely, especially if you have paid for them. Here is a script I just had built for the job. The script checks that your links are still live; just upload a spreadsheet with all the links you want to track. It's also pretty handy if you are doing a few reciprocal niche link exchanges with webmasters you don't entirely trust.
To find out how to run it, see my quick installation guide.
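For a sense of what such a script does, here is a minimal stand-in (my own sketch, not the script linked above): it parses a page's anchors and reports whether your URL is still among them.

```python
import urllib.request
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collect every href from the <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def link_still_live(page_html: str, my_url: str) -> bool:
    """True if my_url still appears in an anchor on the page."""
    parser = HrefCollector()
    parser.feed(page_html)
    return any(my_url in href for href in parser.hrefs)

def check_placements(placements):
    """placements: iterable of (page_url, my_url) pairs, e.g. from a CSV."""
    for page_url, my_url in placements:
        try:
            with urllib.request.urlopen(page_url, timeout=10) as resp:
                page = resp.read().decode("utf-8", errors="replace")
            status = "LIVE" if link_still_live(page, my_url) else "DROPPED"
        except Exception as exc:
            status = f"ERROR ({exc})"
        print(page_url, "->", status)
```

Feeding `check_placements` your spreadsheet of placements once a week gives you an early warning when a paid or exchanged link quietly disappears.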
2) Use a Link Atrophy Tool – this one is good and powered by Linkscape, though you will have to wait for the update.
You can also discover your dropped links using MajesticSEO. I have just tried it out on Beatthatquotes' dropped links. Let's see what links Google removed from this site's link profile > removed links.
3) Control your own links. Creating a powerful network of sites is expensive and time-consuming. I have seen SEOs cower at the term 'network' and associate it with badly thought-out link networks run by link brokers, but it doesn't have to be this way. There is nothing wrong with having a network of sites; just make sure you do it right. If you don't have the time, just pay Fantomaster 500,000 Euros and he'll sort you out!
4) Keep linking to your good guest posts, especially those on authority domains. A link from a strong domain will work even better if you build links to the page it sits on.
5) Monitor OBLs and bad neighbours, especially if you are 'renting' links. Keep checking the pages from time to time and make sure the webmaster is keeping his link sales on topic. Remove your link when you start seeing pharmacy, escort, casino or debt consolidation links. Try this Bad Neighborhood Checker:
6) Monitor the content quality of your link placement sites. Sites often change hands, and new webmasters may start publishing duplicate content, which will lower the trust metrics of the site and the link equity passed.
7) Monitor the rate of link sales. Before you buy a link on a page, check the rate at which new links are added to it, and remember each one added will dilute your link equity.
8) Track changes to the mozRank passed (mR passed) on the bulk of your quality links. I haven't tried this yet, but it may be possible with the SEOmoz API.
9) Keep on top of Google's algorithm changes, especially the ones that may devalue your links. Also find out what industry leaders consider to be link devaluation metrics.
10) As social metrics are now a ranking signal, I'm sure a link on a highly shared page will have more value. The key is to find links on pages likely to be continually shared, such as good resource pages; they will also be more likely to attract links over time.
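The 'bad neighbour' monitoring above can be roughed out in a few lines: pull the outbound links from a page you rent a link on and flag any pointing at off-topic spam niches. The keyword list is an illustrative assumption you would tune for your own market:

```python
import re

# Illustrative spam-niche keywords to watch for among a page's outbound links.
SPAM_HINTS = ("pharmacy", "viagra", "casino", "poker", "escort", "payday")

def flag_bad_neighbours(page_html: str):
    """Return the hrefs on a page whose URL contains a spam-niche keyword."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', page_html, flags=re.I)
    return [h for h in hrefs if any(k in h.lower() for k in SPAM_HINTS)]

page = '<a href="http://cheap-viagra.example">x</a> <a href="http://good.example">y</a>'
print(flag_bad_neighbours(page))  # ['http://cheap-viagra.example']
```

Run it against each placement page on a schedule and you will catch a webmaster drifting off topic long before the page's trust metrics have fully tanked.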
Every professional link builder should think about link equity decay, especially if they are tracking link velocity closely against competitors. If you have any thoughts or ideas on this topic let me know in the comments below.
Update: Since I started writing this post, SEOmoz has turned Linkscape off entirely and has not included any mozRank passed metrics in OSE. I'm hoping they have something up their sleeve that will be rolled out soon.