What are the Marketing Costs of a DDoS attack?

This post is written by Chris Dyson. Chris is an SEO consultant for Hit Reach and blogs infrequently at TripleSEO.com.

 

It seems like a month doesn’t go by without a story popping up on some tech news blog about another major DDoS attack. Just in the past few days, Moz has been suffering from a DDoS attack on its site.

The term “DDoS” stands for “Distributed Denial of Service” and it’s an attack that is commonly used by hackers to bring down a website temporarily. While this hack doesn’t really help anyone gain access to specific information on its own, it is a very useful tool for making a website unreachable by its intended audience. These attacks have been around for quite some time, but they were more recently popularized by those perpetrated by Anonymous against various large companies.

The Original Denial of Service


The way these attacks work is fairly simple. Web servers are only able to serve a certain number of users at any given time due to bandwidth and other constraints. When too many people try to access the site at once, the server cannot handle the load being placed on it and the site becomes unavailable. Attackers use specialized software to access a website from many different IP addresses at once, and when many machines target the same website at the same time, this can overload the host server and make the site impossible to reach. Some of these tools are proprietary, but others are open to the public. For example, Anonymous has freely distributed the “Low Orbit Ion Cannon” software for public use in DDoS attacks.

In the end, there’s a lot of argument over whether all DDoS attacks are pure vandalism or if some fall under the blanket of Hacktivism. After all, many of the most famous DDoS attacks did specifically target certain companies and organizations for very specific reasons. While that debate rages on, people will still continue to perform these attacks on websites.

The Impact to Your Website


DDoS attacks can hurt your revenue, but there are of course other concerns too:

1. Brand and customer perception – an attack can inflict brand damage while granting a competitive advantage to your rivals.

2. Email and contact centers – once network infrastructure and routers are targeted, a DDoS attack might bring down email and customer contact centers, particularly if the call center runs on a voice-over-IP (VoIP) network. During such an incident, the attack can interrupt communication with customers, partners, vendors and even staff.

3. Stock price and market confidence – some organisations hit by DDoS attacks have seen stock prices briefly fall and/or fluctuate because of market concerns. For large brands this can be significantly expensive.

4. Search engine rankings – one negative consequence of DDoS attacks that usually gets overlooked is the potential effect on rankings. We already know that if your website isn’t accessible or crawlable, it could hurt your rankings; after all, Google needs to serve its users quality results and websites that work. So, if your website is down from a DDoS attack and Google sees that it’s uncrawlable, it’s fair to assume that your rankings could take a hit.

It’s important to point out that the length of time your site is down plays a role in determining whether or not your rankings will be affected. According to everyone’s favourite Googler Matt Cutts, “If it [your site being down] was just for a day, you should be in pretty good shape. If your host is down for two weeks, then there’s a better indicator that the website is actually down and we don’t want to send users to a website that is actually down. So if it was only just a short period of downtime, I wouldn’t really worry about that [affecting your rankings].”
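One way to keep a short stretch of downtime from looking permanent to crawlers (this isn’t something Cutts mentions in the quote above, just a commonly recommended practice) is to answer with an HTTP 503 and a Retry-After header while the site is down or being mitigated. A minimal sketch using Flask, with a hypothetical maintenance flag:

```python
# Minimal sketch: serve HTTP 503 with Retry-After while the site is in
# maintenance/mitigation mode, so crawlers treat the downtime as temporary.
# The MAINTENANCE flag and retry window are hypothetical placeholders.
from flask import Flask, Response

app = Flask(__name__)
MAINTENANCE = True          # toggle while the site is down or being mitigated
RETRY_AFTER_SECONDS = 3600  # ask crawlers to come back in an hour

@app.before_request
def maintenance_gate():
    if MAINTENANCE:
        return Response(
            "Service temporarily unavailable",
            status=503,
            headers={"Retry-After": str(RETRY_AFTER_SECONDS)},
        )

if __name__ == "__main__":
    app.run()
```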

DDoS attacks are very often incorrectly equated with a full service outage. In fact, the biggest impact of DDoS/DoS attacks in 2013 was service degradation, which in most cases presents itself as a slow website.

A recent study by TRAC Research of 300 businesses reported three very interesting findings:

• Average revenue losses of $21k per hour of downtime.

• Average revenue losses of $4k per hour of performance slowdown.

• Website slow-downs occur up to ten times more frequently than website outages.

In other words, website slowdowns can have a greater impact on your revenue over time than outages. While temporary outages cost more per minute, slowdowns last for significantly more time and can ultimately cost more.
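To see why, it helps to run the TRAC numbers. The per-hour figures below come from the study; the hours of downtime and slowdown per month are purely hypothetical assumptions to illustrate the point:

```python
# Illustrative back-of-the-envelope comparison using the TRAC per-hour figures.
# The hours of downtime/slowdown per month are hypothetical assumptions.
OUTAGE_COST_PER_HOUR = 21_000   # $ per hour of downtime (TRAC)
SLOWDOWN_COST_PER_HOUR = 4_000  # $ per hour of performance slowdown (TRAC)

outage_hours_per_month = 1                               # assumed: one hour of outage
slowdown_hours_per_month = 10 * outage_hours_per_month   # slowdowns ~10x more frequent

outage_cost = OUTAGE_COST_PER_HOUR * outage_hours_per_month        # $21,000
slowdown_cost = SLOWDOWN_COST_PER_HOUR * slowdown_hours_per_month  # $40,000

print(f"Outage cost:   ${outage_cost:,}")
print(f"Slowdown cost: ${slowdown_cost:,}")
```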

And what about the impact on customer retention?

• During a temporary outage, 9% of customers will permanently abandon your website.

• When your website is unacceptably slow, 28% of customers will permanently abandon it.

Or to put it another way: the permanent abandonment rate for a slow site is more than three times greater than the abandonment rate for a site that is temporarily down. Think about that for a minute.

Unfortunately, some black hat SEOs have already started using DDoS attacks against competitors as a tactic to damage their sales and rankings.

How to Identify a DDoS Attack

Most cloud DDoS mitigation services are available on demand, which means that customers typically enable the service only once they are the victim of a DDoS attack.

There are multiple monitoring tools you can use to capture traffic changes, which can help confirm whether or not you’re under a DDoS attack.

#1. Internal server, network and infrastructure monitoring applications

Companies have plenty of monitoring packages and applications to choose from; one of the more popular pieces of software, Nagios, allows you to monitor internal infrastructure status and the performance of applications, services, operating systems, network protocols, system metrics and network infrastructure.

For example, monitoring software can check your HTTP service to make sure that a website or server is functioning properly, and if the service isn’t functioning, most software includes real-time notification. Because most DDoS attacks target a web server or application server, monitoring software may show the HTTP service suffering from slowness, high CPU utilisation or complete failure. While monitoring servers and infrastructure is useful, there’s no guarantee that a DDoS attack is the cause: abnormal spikes in traffic and usage do occur for legitimate reasons from time to time.
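As a rough illustration of the kind of HTTP check such software performs, here’s a minimal sketch in Python. The URL and thresholds are hypothetical placeholders, and a real deployment would lean on Nagios or a similar package rather than a hand-rolled script:

```python
# Minimal sketch of an HTTP service check: flag the site as degraded if it is
# slow or returns an error. URL and thresholds are hypothetical placeholders.
import time
import requests

URL = "https://www.example.com/"
SLOW_THRESHOLD_SECONDS = 2.0

def check_http(url: str) -> str:
    start = time.monotonic()
    try:
        response = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        return f"CRITICAL: request failed ({exc})"
    elapsed = time.monotonic() - start
    if response.status_code >= 500:
        return f"CRITICAL: HTTP {response.status_code}"
    if elapsed > SLOW_THRESHOLD_SECONDS:
        return f"WARNING: slow response ({elapsed:.2f}s)"
    return f"OK: HTTP {response.status_code} in {elapsed:.2f}s"

if __name__ == "__main__":
    print(check_http(URL))
```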

#2. External Performance Monitoring Solutions in the Cloud

Companies that don’t host their own websites and instead use services like Amazon EC2 will get the most from third-party solutions. Unlike network/infrastructure tools, which are usually installed within your own network, external performance monitoring solutions are typically provided by a third party and leverage monitoring locations from around the world.

External monitoring tools can examine many elements:

• Virtual browsers to check basic website availability and performance over time

• Real browsers to monitor site/application performance, errors and service degradation

• Network services such as DNS, FTP and email, among others

From a DDoS perspective, an external third-party monitoring service is a sensible answer. The aim of this kind of tool is to continuously monitor a website, service or application and check for potential indicators of a DDoS attack.

That said, although a third-party external tool can help detect DDoS attacks, these solutions are not foolproof. While external tools can show that site performance is degrading or failing, they cannot confirm the cause.

Originally, the goal of third-party tools was to make sure that ISPs, hosting and servers were functioning as designed, so slow response times and outages may just as easily indicate a hosting provider or server problem.
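A crude way to approximate what an external monitor does is to time the stages of a request separately, since a slow DNS lookup points to a different problem than a slow web server. The hostname below is a placeholder and this is only a sketch of the idea:

```python
# Rough sketch: time DNS resolution and the HTTP request separately, since a
# slow lookup and a slow server point to different problems. The hostname is
# a hypothetical placeholder.
import socket
import time
import requests

HOST = "www.example.com"

def time_dns(host: str) -> float:
    start = time.monotonic()
    socket.getaddrinfo(host, 443)
    return time.monotonic() - start

def time_http(host: str) -> float:
    start = time.monotonic()
    requests.get(f"https://{host}/", timeout=10)
    return time.monotonic() - start

if __name__ == "__main__":
    print(f"DNS lookup:   {time_dns(HOST):.3f}s")
    print(f"HTTP request: {time_http(HOST):.3f}s")
```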

How to Test Your Site’s Vulnerability

If you are a competent technical person and have a few friends, you could try LOIC (Low Orbit Ion Cannon) or HOIC (High Orbit Ion Cannon), which were released to the public by Anonymous; of course, you could also just spend $5 on Fiverr and get a report carried out for you to highlight the most common vulnerabilities. Regular DDoS attacks like those launched by LOIC work by overwhelming the server with complete requests. Slowloris works differently: it opens connections to the server but never actually completes them, which ties up all the server’s sockets and results in a DoS. The easiest way to mitigate this sort of attack is by limiting the number of connections from a single client, but of course there’s a way to get around that using Tor and PyLoris, a Python implementation of Slowloris.
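To make the “limit connections per single client” idea concrete, here’s a toy sketch of a TCP server that caps concurrent connections per IP. In practice this belongs in your web server, load balancer or firewall (connection limits in nginx, iptables and the like), not in application code; the limit and port are hypothetical:

```python
# Toy sketch of per-client connection limiting, the basic defence against
# Slowloris-style attacks. The per-IP cap and port are hypothetical.
import asyncio
from collections import defaultdict

MAX_CONNECTIONS_PER_IP = 10   # hypothetical per-client cap
active = defaultdict(int)     # open connections per client IP

async def handle(reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
    ip = writer.get_extra_info("peername")[0]
    if active[ip] >= MAX_CONNECTIONS_PER_IP:
        writer.close()        # refuse clients holding too many sockets open
        return
    active[ip] += 1
    try:
        # Drop clients that never finish sending a request (Slowloris behaviour).
        await asyncio.wait_for(reader.read(1024), timeout=10)
        writer.write(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
        await writer.drain()
    except asyncio.TimeoutError:
        pass
    finally:
        active[ip] -= 1
        writer.close()

async def main():
    server = await asyncio.start_server(handle, "0.0.0.0", 8080)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```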

hping is a command-line TCP/IP packet assembler/analyzer and one of the de facto tools for security auditing and for testing firewalls and networks.

Of course, SEOs have made it really easy for just about anyone to carry out a DDoS attack. Russ Jones wrote an article showing that free SEO tools that create XML sitemaps or crawl sites could all be run against a site at the same time to cause a significant increase in bandwidth and CPU usage.

You should do whatever you feasibly can to prevent DDoS attacks, or at the very least mitigate any attack that does hit your website. Make sure that you have systems in place to detect potential attacks and a plan for responding to them as quickly and effectively as possible.

IP Delivery and Geo Targeting by SebastianX

This is a guest post from the elusive SebastianX; you can read more of his rants on his excellent blog Sebastian’s Pamphlets.

So Gareth James asked me to blather about the role of IP delivery in geo targeting. I answered “That’s a complex topic with gazillions of ‘depends’ lacking the potential of getting handled with a panacea”, and thought he’d just bugger off before I’ve to write a book published on his pathetic UK SEO blog. Unfortunately, it didn’t work according to plan A. This @seo_doctor dude is as persistent as a blowfly attacking a huge horse dump. He dared to reply “lol thats why I asked you!”. OMFG! Usually I throw insults at folks starting a sentence with “lol”, and I don’t communicate with native speakers who niggardly shorten “that’s” to “thats” and don’t capitalize any letter except for “I” for egomaniac purposes.

However, I didn’t annoy the Interwebz with a pamphlet for (perceived) ages, and the topic doesn’t exactly lack controversial discussion, so read on. By the way, Gareth James is a decent guy. I’m just not fair making fun out of his interesting question for the sake of a somewhat funny opening. (That’s why you’ve read this pamphlet on his SEO blog earlier.)

How to increase your bounce rate and get your site tanked on search engine result pages with IP delivery in geo targeting

A sure-fire way to make me use my browser’s back button is any sort of redirect based on my current latitude and longitude. If you try it, you can measure my blood pressure in comparison to an altitude some light-years above mother earth’s ground. You’ve seriously fucked up my surfing experience, therefore you’re blacklisted back to the stone age, and even a few stones farther just to make sure your shitty Internet outlet can’t make it to my browser’s rendering engine any more. Also, I’ll report your crappy attempt to make me sick of you to all major search engines for deceptive cloaking. Don’t screw red crabs. Related protip: Treat your visitors with due respect.

Geo targeted ads are annoying enough. When I’m in a Swiss airport’s transit area reading an article on any US news site about the congress’ latest fuck-up in foreign policy, most probably it’s not your best idea to plaster my cell phone’s limited screen real estate with ads recommending Zurich’s hottest brothel that offers a flat rate as low as 500 ‘fränkli’ (SFR) per night. It makes no sense to make me horny minutes before I enter a plane where I can’t smoke for fucking eight+ hours!

Then if you’re the popular search engine that in its almighty wisdom decides that I’ve to seek a reservation Web form of Boston’s best whorehouse for 10am local time (that’s ETA Logan + 2 hours) via google.ch in French, you’re totally screwed. In other words, because it’s not Google, I go search for it at Bing. (The “goto Google.com” thingy is not exactly reliable, and a totally obsolete detour when I come by with a google.com cookie.)

The same goes for a popular shopping site that redirects me to its Swiss outlet based on my location, although I want to order a book to be delivered to the United States. I’ll place my order elsewhere.

Got it? It’s perfectly fine with me to ask “Do you want to visit our Swiss site? Click here for its version in French, German, Italian or English language”. Just do not force me to view crap I can’t read and didn’t expect to see when I clicked a link!

Regardless of whether you redirect me server sided using a questionable ip2location lookup, or client sided evaluating the location I carelessly opened up to your HTML5 based code, you’re doomed coz I’m pissed. I’ve just increased your bounce rate at lightning speed, and trust me, it’s not just yours truly alone who tells click tracking search engines that your site is scum.

How to fuck up your geo targeting with IP delivery, SEO-wise

Of course there’s no bullet proof way to obtain a visitor’s actual location based on the HTTP request’s IP address. Also, if the visitor is a search engine crawler, it requests your stuff from Mountain View, Redmond, or an undisclosed location in China, Russia, or some dubious banana republic. I bet that as a US based Internet marketer offering local services across all states you can’t serve a meaningful ad targeting Berlin, Paris, Moscow or Canton. Not that Ms Googlebot appreciates cloaked content tailored for folks residing at 1600 Amphitheatre Parkway, by the way.

There’s nothing wrong with delivering a cialis™ or viagra® dealer’s sales pitch to search engine users from a throwaway domain that appeared on a [how to enhance my sexual performance] SERP for undisclosable reasons, but you really shouldn’t do that (or something similar) from your bread and butter site.

When you’ve content in different languages and/or you’re targeting different countries, regions, or whatever, you shall link that content together by language and geographical targets, providing prominent but not obfuscating links to other areas of your site (or local domains) for visitors who –indicated by browser language settings, search terms taken from the query string of the referring page, detected (well, guessed) location, or other available signals– might be interested in these versions. You can and should group those site areas by sitemaps as well as reasonable internal linkage, and use other techniques that distribute link love to each localized version.

Thou shalt not serve more than one version of localized content under one URI! If you can’t resist, you’ll piss off your visitors and you’ll be asking for trouble with search engines. This golden rule applies to IP delivery as well as to any other method that redirects users without explicit agreement. Don’t rely on cookies and such to determine the user’s preferred region or language; always provide visible alternatives when you serve localized content based on previously collected user decisions.
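As a sketch of that “suggest, don’t force” approach, the hypothetical Flask example below guesses a preferred language from the Accept-Language header and offers a visible link to the localized version instead of redirecting; routes, URLs and the language list are placeholders:

```python
# Sketch: offer the localized version instead of forcing a redirect.
# Routes, URLs and the language list are hypothetical placeholders.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical localized versions of the site, keyed by language code.
SUPPORTED = {"de": "/de/", "fr": "/fr/", "it": "/it/", "en": "/"}

@app.route("/")
def home():
    guessed = request.accept_languages.best_match(list(SUPPORTED)) or "en"
    banner = ""
    if guessed != "en":
        # A visible suggestion, not a redirect: the visitor stays on the page they asked for.
        banner = (f'<p>Prefer this page in another language? '
                  f'<a href="{SUPPORTED[guessed]}">Switch to {guessed.upper()}</a></p>')
    return banner + "<h1>Welcome</h1>"

if __name__ == "__main__":
    app.run()
```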

But …

Of course there are exceptions to this rule. For example it’s not exactly recommended to provide content featuring freedom of assembly and expression in fascist countries like Iran, Russia or China, and bare boobs as well as Web analytics or Facebook ‘like’ buttons can get you into deep shit in countries like Germany, where last century nazis make the Internet laws. So sometimes, IP delivery is the way to go.

Link Decay And Link Equity Preservation

The majority of link building posts I see focus on acquiring new links, but what about your existing links? Once you have a strong link profile, it needs to be monitored and consolidated. You don’t want your best links to drop off. This post looks at ‘link decay’, also known as ‘web decay’ and ‘link rot’.

What is link decay?

‘Web decay’ in information retrieval science refers to the loss of pages from the web. Back in 2006 Bill Slawski blogged about a patent that looked at link decay, in which web pages were found to disappear from the web at a rate of 0.5% per week.


Worldwidewebsize estimates the total number of pages on the Internet at 19.64 billion, while SEOmoz say the average page has 13.32 external links on it. By my calculations, that’s 1.3 billion links being wiped out each week!
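For anyone who wants to check that figure, the arithmetic is simply:

```python
# Back-of-the-envelope check of the 1.3 billion links/week figure.
total_pages = 19.64e9          # Worldwidewebsize estimate of pages on the Internet
links_per_page = 13.32         # SEOmoz average external links per page
weekly_decay_rate = 0.005      # 0.5% of pages disappearing per week

links_lost_per_week = total_pages * links_per_page * weekly_decay_rate
print(f"{links_lost_per_week / 1e9:.2f} billion links lost per week")  # ~1.31 billion
```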


Conversion Rates: To Pay, Or Not To Pay?

This is a guest post by Ben Morel, who is the search engine marketing specialist at a Surrey digital marketing agency. He has a background in marketing, offshore finance and radio astronomy.

———————————————————————————————

While Gareth and I were deciding on the subject of this guest post, we ended up discussing the differences between paid and natural search. Paid search has often been maligned: many of my clients have had bad experiences in the past with agencies that focused only on the number of people coming to the site. On the other hand, at the agency I work for we often recommend paid search to new clients as the easiest way of bringing new customers to the site. This has been made that bit more topical over on SEOmoz, where the old question of “SEO vs PPC: which would win in a fight?” has recently been discussed. So, Gareth and I decided it could be interesting to look at site engagement and conversion rates for paid and non-paid search.

At the agency we have several clients for whom we do both PPC and SEO, from a variety of industries including:

  • An online kitchen appliance store
  • A New Zealand wine merchant
  • An interior design and hardware showroom
  • An overseas volunteering charity
  • An outdoor e-commerce store
  • A chain of tile showrooms.

The data for the above has been amalgamated over a six-month period and analysed, and I shall try to explore what it means.

PPC vs SEO: The Rivalry

 

First of all, the usual arguments for and against each. These are arguments that we certainly present to new clients. I won’t do a table of pros and cons, because my opinion is that things are a bit more subtle than that.

PPC Starts Quickly, SEO Takes Time To Build

For a business wanting to get off the mark today, this is definitely a pro for PPC. In the long term, however, the build up for SEO allows testing of on-page elements and application of conversion rate optimisation on a small audience so that when it finally starts pulling in traffic that traffic converts well. Of course you can also keep your PPC budget low while testing on-page factors before ramping it up to keep ROI high, but then you lose some of that initial acceleration of sales. So which is better? It depends: on a brand new site I would say PPC. On a mature site, though, both can be used to meet different objectives.

PPC Costs Every Time Someone Clicks, SEO Costs Are Initiative-Based

This means that it’s easy to count the cost of PPC: just take the amount you spent in a month and add the amount spent on your agency or in-house team. Calculating ROI then follows on smoothly, as long as your website tracking is set up properly. With SEO, costs are a bit more difficult to track. We can track the costs to build a certain number of links in the month, keep social media up to date, create articles for distribution and all the other things that take up our time as SEOs, but measuring ROI from each is a lot more difficult. So which is better? Well, in some ways PPC: you know your costs and returns immediately. But links will keep on giving juice, and visitors, for months to come, so ROI for a certain amount of resource spent on SEO keeps going up.
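To illustrate how straightforward the PPC side of that calculation is, here’s a trivial example; every figure in it is made up:

```python
# Illustrative PPC ROI calculation -- every figure here is hypothetical.
ad_spend = 5_000.0           # hypothetical monthly spend on clicks
management_fee = 1_000.0     # hypothetical agency / in-house management cost
revenue_from_ppc = 18_000.0  # hypothetical tracked revenue attributed to PPC

cost = ad_spend + management_fee
roi = (revenue_from_ppc - cost) / cost
print(f"PPC ROI: {roi:.0%}")  # 200%
```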

You Can Share Results between the Two

As Danny Dover comments on the SEOMoz Whiteboard Friday (see below), PPC best practices are very different to SEO best practices. PPC landing page tests generate huge amounts of duplicate content and cannibalise keywords from other pages, which is exactly the opposite of what you want in SEO. However, the results from those landing page tests can be very useful for SEOs as well, so both teams can derive benefits. Also, rel=canonical tags can be used to minimise overlap.
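As a sketch of that last point, PPC landing page variants can declare the primary page as canonical so the test pages don’t compete with it in organic search. The route and URLs below are hypothetical:

```python
# Sketch: PPC landing page test variants point rel=canonical at the primary
# page. The route and URLs are hypothetical placeholders.
from flask import Flask

app = Flask(__name__)

# Hypothetical primary landing page that the test variants should defer to.
CANONICAL = "https://www.example.com/blue-widgets/"

@app.route("/blue-widgets/variant-<int:n>/")
def landing_variant(n: int):
    # Each variant declares the primary page as canonical so the near-duplicate
    # test pages don't compete with it in organic search.
    return f"""<!doctype html>
<html>
  <head>
    <link rel="canonical" href="{CANONICAL}">
    <title>Blue Widgets 30% Off (test variant {n})</title>
  </head>
  <body><h1>Blue Widgets 30% Off</h1></body>
</html>"""

if __name__ == "__main__":
    app.run()
```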

At least as important are the SEO and CRO benefits of ad testing. Now, you may think I’m slightly daft for saying this, but the aim of ads is to entice people to your site as often as possible and get them engaged. The aim of meta titles, and even the headlines on your landing pages, is exactly the same, so if a PPC ad has a higher CTR for the headline “Blue Widgets 30% Off” than for “30% Off Blue Widgets”, you can almost guarantee that using the text this way round for the page headline will improve ROI.

Engagement and ROI

 

Now that the question of which to implement is solved (that is, run campaigns for both if you can), what about the results?

The New Zealand wine merchant makes a good case study. Over the past six months we have seen that, for non-branded traffic, natural search has a conversion rate 60% higher than paid search. However, the aims of this client are long term – getting the best lifetime customer value (LCV) they can. For that they rely on their email list, and both segments have around a 2.70% sign-up rate for the monthly newsletter, meaning that LCV will be very close in a couple of years’ time.

With the tile showroom, things are even more interesting. As you may expect, local search plays an important part in their campaigns and this importance is especially high for paid search, where local terms have a conversion rate four times greater than the rest of the PPC account. While conversion rates are again lower for PPC overall, conversion rates for local keywords are the same for paid and non-paid search.

So how do clients fare overall? The table below gives an impression of success by looking at conversion rates and using bounce rate as a measure of engagement. We have also included Google Product Search results out of interest, and excluded keywords containing the clients’ brands.

Medium     Visits     Bounce Rate    Conversion Rate    Per Visit Value
PPC        157,009    31.43%         3.04%              £2.98
Natural     92,096    51.04%         4.84%              £4.15
Products    15,762    57.03%         7.18%              £1.96

 

As you can see, PPC advertising does have a place: engagement is higher than with the other media, even if its conversion rate is the lowest of the three. These PPC accounts do seem profitable, which can never be a bad thing, but they evidently need refinement.

This is perhaps the crux of the problem with PPC: although an account can be set up and set live in a day, it takes much longer to refine keyword lists, test ads and improve landing pages. In getting them up to speed, then, PPC and SEO are in many ways on a level footing, and, as the first section of this post postulates, which is better depends on your short- and long-term goals, your budget and how to please the person who pays you.
