Even with the wide range of strategies DDoS attacks employ in their quest for devastation, they’ve earned a narrow reputation amongst casual observers of the state of cybersecurity. Thanks to major IoT botnets like Mirai and recent record-breaking memcached attacks on targets like GitHub, DDoS attacks have come to be seen as the Mike Tyson punch of cyber assaults: thundering bangers so powerful it’s no wonder many opponents can do nothing more than stagger, stumble and drop.

However, as DDoS professionals and security analysts will tell you, the attacks that are most difficult to deal with are the ones that use brains, not brawn. This is unfortunate considering the numbers are in for the fourth quarter of 2017, and the trend towards increasingly smart attacks is ramping up. Instead of knockout punch attempts, your DDoS mitigation is going to be dealing with some pretty brilliant rope-a-dope. Is it prepared for Muhammad Ali?

By the numbers

Those booming DDoS, or distributed denial of service, attacks made famous by internet-shaking assaults that took the likes of Reddit and Netflix offline tend to be aimed at the network layer. There isn’t anything clever about these attacks, and there’s no real attempt to disguise them; it’s just a huge amount of malicious traffic barraging a victim. For an unprotected website this means assured downtime, but for any online service with decent DDoS protection these attacks are easy to detect and therefore easy to mitigate before they can greatly affect availability, so long as the protection appliance or service is highly scalable.

For protection that qualifies as decent but not much better, it’s a whack of bad news from Imperva, whose DDoS protection division Incapsula recently published its Global DDoS Threat Landscape report for Q4 2017. The number of network layer attacks fell a full 50% from the third quarter, and making up the difference were the brainy, hard-to-stop application layer attacks, which rose 43%.

Application layer madness

Unlike their network-layer counterparts, application layer attacks look and act like legitimate requests from legitimate website users. This allows them to sneak past a great deal of protection measures meant to be looking for irregular and suspicious traffic patterns. That isn’t the extent of application layer attack craftiness either, as these small but strenuous attacks are precisely designed to expend the smallest amount of attacker effort while consuming the maximum amount of server-side resources. Many professional attackers research their intended targets, finding the website elements that require the most work from the server – such as dynamic content that can’t be cached – and load those elements repeatedly. Application layer attacks are basically the Rumble in the Jungle, and all too often, the target server is George Foreman left lying flat on the canvas.
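
The asymmetry described above – cheap requests, expensive responses – is also something defenders can measure. The following is a minimal, hypothetical sketch (names and sample data are invented for illustration) of ranking endpoints by average server-side cost per request, which is roughly the ranking an application layer attacker is hunting for:

```python
# Hypothetical sketch: ranking endpoints by the server-side cost a
# low-and-slow application layer attack would try to exploit.
# The log format and sample data are illustrative only.
from collections import defaultdict

def endpoint_cost_scores(request_log):
    """request_log: iterable of (endpoint, response_time_seconds) pairs.
    Returns endpoints sorted by average server time per request."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for endpoint, seconds in request_log:
        totals[endpoint] += seconds
        counts[endpoint] += 1
    # Uncacheable dynamic pages tend to float to the top of this list.
    return sorted(
        ((ep, totals[ep] / counts[ep]) for ep in totals),
        key=lambda item: item[1],
        reverse=True,
    )

log = [
    ("/search?q=*", 1.8),   # dynamic, uncached, expensive
    ("/index.html", 0.02),  # static, cheap
    ("/search?q=*", 2.1),
    ("/logo.png", 0.01),
]
print(endpoint_cost_scores(log)[0][0])  # prints "/search?q=*"
```

The same ranking a defender uses to decide what to cache or rate-limit is what an attacker uses to decide what to hammer.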

Within that 43% increase in application layer attacks in the fourth quarter of 2017, Incapsula specifically spotted a sizable uptick in assaults that weighed in between 100 and 1000 requests per second (RPS), with over 50% of fourth quarter application layer attacks landing in that category. This points to an increase in DDoS-for-hire users taking aim with application layer assaults. These attacks are cheaper to launch and sustain than network layer attacks, leading even non-professional attackers to take the clever route when it comes to knocking out their targets.
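
To make the RPS figures concrete, here is a small sketch (not from the report; the class and thresholds are invented for illustration) of estimating a request rate with a one-second sliding window and bucketing it into bands like the 100–1000 RPS category above:

```python
# Illustrative sketch: measuring requests per second with a sliding
# one-second window, then bucketing the rate into reporting bands.
from collections import deque

class RpsMeter:
    """Count requests falling inside a sliding one-second window."""
    def __init__(self):
        self.events = deque()

    def record(self, timestamp):
        self.events.append(timestamp)
        # Drop events older than one second.
        while self.events and self.events[0] <= timestamp - 1.0:
            self.events.popleft()
        return len(self.events)  # current requests-per-second estimate

def rps_band(rps):
    if rps < 100:
        return "under 100 RPS"
    if rps <= 1000:
        return "100-1000 RPS"  # the band that held over 50% of Q4 attacks
    return "over 1000 RPS"

meter = RpsMeter()
for i in range(150):            # 150 requests in 0.15 simulated seconds
    rate = meter.record(i / 1000.0)
print(rps_band(rate))           # prints "100-1000 RPS"
```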

Increasingly brilliant bots

Behind most distributed denial of service attacks are swarms of DDoS bots. For a DDoS attack to be successful on a website with some level of DDoS protection, the bots that make up malicious DDoS traffic need to go undetected by the security measures put in place to stop them. In Q4, 17% of DDoS bots were capable of doing exactly that, bypassing either cookie or JavaScript challenges. This is an increase of 10% compared to Q3. Even more startlingly, of the 17% of bots with bypass capabilities, 16.1% had the ability to bypass both cookie and JavaScript challenges, an increase of 14.3% from Q3.

Cookie and JavaScript challenges are two of the most common security challenges used to identify DDoS traffic. Having both in place may have once been seen as strong security, but those days are long gone. At least they should be.
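
To show what a bot has to get past, here is a minimal sketch of the cookie challenge idea – the class and client IDs are invented for illustration, and real services wrap this in far more machinery:

```python
# Minimal sketch of a cookie challenge. A real browser echoes the
# issued token back on its next request; a naive flood bot never does.
# Class name and client identifiers are illustrative only.
import secrets

class CookieChallenge:
    def __init__(self):
        self.issued = {}  # client_id -> expected token

    def challenge(self, client_id):
        """First contact: respond with a Set-Cookie token instead of content."""
        token = secrets.token_hex(8)
        self.issued[client_id] = token
        return token

    def verify(self, client_id, presented_token):
        """Only serve content to clients that returned the cookie."""
        return self.issued.get(client_id) == presented_token

gate = CookieChallenge()
token = gate.challenge("203.0.113.7")
print(gate.verify("203.0.113.7", token))  # browser-like client: True
print(gate.verify("203.0.113.7", None))   # cookie-blind bot: False
```

A bypass-capable bot simply stores and replays the cookie like a browser would, which is exactly why this filter alone no longer qualifies as strong security.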

Making the grade

While it’s impossible to predict what the DDoS landscape has in store, there’s little chance the professional attackers making bank on the dark web, the cybercriminals coding DDoS bots, or the malicious entrepreneurs running DDoS-for-hire services are going to decide that less sophisticated attacks are the wave of the future. Muhammad Ali never decided to start throwing wanton haymakers, after all.

For websites and businesses that can’t afford downtime, reputation damage, loss of user loyalty and the many other major costs and consequences of a successful attack, DDoS protection first of all needs to be a professional cloud-based managed service, and secondly needs to employ the type of granular traffic inspection that can bounce even the smartest attacks to a scrubbing server. This means a layered approach to bot detection that combines static analysis, behavioral analysis and progressive challenges that, of course, extend beyond cookie and JavaScript challenges. Fighting the latest DDoS attacks has largely become a battle of wits, and it isn’t a battle many businesses can afford to lose.
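
As a rough illustration of that layered approach, here is a hedged sketch combining a static signal, a behavioral signal and challenge results into a single verdict. The weights, thresholds and signals are invented for illustration; production systems use far richer signal sets:

```python
# Illustrative sketch of layered bot detection: static analysis,
# behavioral analysis and challenge results feed one suspicion score.
# All weights and thresholds here are invented for illustration.
def classify_client(user_agent, rps, passed_cookie, passed_js):
    score = 0
    # Static analysis: empty or known-automation user agents.
    if not user_agent or "python-requests" in user_agent.lower():
        score += 2
    # Behavioral analysis: request rates far beyond human browsing.
    if rps > 50:
        score += 2
    # Progressive challenges: each failure raises suspicion.
    if not passed_cookie:
        score += 1
    if not passed_js:
        score += 1
    if score >= 4:
        return "block"
    if score >= 2:
        return "challenge further"  # escalate beyond cookie/JS checks
    return "allow"

print(classify_client("Mozilla/5.0", rps=2,
                      passed_cookie=True, passed_js=True))    # allow
print(classify_client("python-requests/2.18", rps=120,
                      passed_cookie=False, passed_js=False))  # block
```

The point of layering is that a bot bypassing both cookie and JavaScript challenges still trips the static and behavioral layers, so no single bypass capability wins the fight.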