Hidden Algorithm Flaws Expose Websites to DoS Attacks

This week, the notorious 8chan went down after its infrastructure provider Cloudflare withdrew services over the forum’s radical, violence-promoting content. Cloudflare didn’t shut the site down directly, but by removing its protection against distributed denial of service attacks, it all but guaranteed that the forum would crash. And while classic DDoS attacks, which overwhelm a site with junk traffic, have persisted and evolved across the web, researchers are warning about a new spinoff: subtle attacks that target not a server’s capacity but its algorithms.

Many websites and services rely on algorithms to transform data inputs into actions and results. But new research detailed Thursday at the Black Hat cybersecurity conference in Las Vegas shows how a small, seemingly innocuous input for an algorithm can cause it to do a huge amount of work—slowing a service down or crashing it entirely in the process, all with just a few bytes.


Nathan Hauke and David Renardy of the security firm Two Six Labs started looking for these “algorithmic complexity” issues in mainstream services, and quickly found them in PDF readers, remote desktop servers, and a popular password strength evaluation tool. Their research showed that, given some carefully crafted inputs, they could bring all of those services to a halt. What’s troubling is that these vulnerabilities aren’t really software bugs that can be easily patched or fixed—they’re fundamental issues in the way algorithms are built and implemented that allow a tiny input to generate major resource drain.

“It’s a situation where developers have implemented some algorithm that has unacceptable worst-case performance,” says Renardy. “We looked at three different sets of software unrelated to one another—totally different algorithms, totally different situations—and found that they all suffer from a similar type of vulnerability.”
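
The class of flaw is easy to demonstrate outside the researchers’ specific targets. The Python sketch below is an illustration of the general idea, not drawn from the Two Six Labs findings: a regular expression with exponential worst-case backtracking, where a payload of a couple dozen bytes keeps the matcher busy for a rapidly growing amount of time.

```python
import re
import time

# Illustrative only: a pattern with exponential worst-case backtracking.
# Each extra byte of input roughly doubles the work the matcher does.
EVIL_PATTERN = re.compile(r"(a+)+$")

for n in (18, 20, 22, 24):
    payload = "a" * n + "!"           # a couple dozen bytes of "innocuous" input
    start = time.perf_counter()
    EVIL_PATTERN.match(payload)       # never matches, but backtracks exponentially
    print(f"{n + 1:>3} bytes -> {time.perf_counter() - start:.3f} s")
```

The same shape of problem, an input whose cost explodes in the worst case, is what the researchers found lurking in much larger systems.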

PDF readers represent an especially broad area of concern, because the fault lies in the PDF specification itself. Documents can be maliciously crafted so that when a parser analyzes the file, opening it takes minutes or even hours. The researchers looked in particular at optical character recognition parsing libraries, used by, say, an accounting system that ingests PDF receipts and automatically detects expense items. They also found that they could craft PDFs that consume an outsize amount of resources when converted into other file formats.

In practice, an attacker could use a single PDF to crash an entire expense report platform or PDF processing system, or render it unresponsive to all of a company’s users, not just the one who uploaded the malicious file. While developers can mitigate these types of vulnerabilities by modifying their implementations of the ancient PDF specification, many PDF tools and libraries sit out on the web unmaintained. Still, the researchers say they’re working with some parser library developers to help fix their products.
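
Since the specification itself isn’t going to change overnight, the practical defense is to cap how much work any single document is allowed to consume. The sketch below shows that pattern under an assumption: parse_pdf is a hypothetical stand-in for whatever parsing or OCR library a service actually uses. The parse runs in a separate process and gets killed if it exceeds a time budget.

```python
import multiprocessing

def parse_pdf(path, result_queue):
    # Hypothetical stand-in for a real PDF parsing / OCR library call.
    result_queue.put(("parsed", path))

def parse_with_budget(path, seconds=10):
    """Parse a PDF in a worker process; kill the worker if it exceeds the budget."""
    queue = multiprocessing.Queue()
    worker = multiprocessing.Process(target=parse_pdf, args=(path, queue))
    worker.start()
    worker.join(timeout=seconds)
    if worker.is_alive():
        worker.terminate()   # stop the runaway parse instead of letting it spin
        worker.join()
        raise RuntimeError(f"rejected {path}: parsing exceeded {seconds}s budget")
    return queue.get()
```

Running the parse out of process matters here; a timeout inside the same thread generally can’t interrupt a parser that’s stuck deep in native code.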


The second place where algorithmic complexity vulnerabilities cropped up was in virtual network computing servers used for remote desktop products. Most Linux VNC servers are built on the same legacy code, and the researchers found that they could launch an algorithmic complexity attack that causes these servers to start generating junk data and filling up their own hard drives, creating a sort of traffic jam inside the server. The researchers notified five VNC services about the issue, and at least one, TurboVNC, has already made changes to protect against these types of DoS attacks.

“It fills the disk of the machine, it exhausts its entire disk space,” notes Renardy. “You’d have to clear out these files that are getting written. And for most victims all you’d know is that you’ve lost all of your available disk space, you probably wouldn’t know why. Because it’s very much at the design level. It’s attacking the implementation, attacking the design.”
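
TurboVNC’s actual fix isn’t spelled out here, but a generic defense against this kind of disk exhaustion is to budget writes explicitly and refuse them once a free-space floor or a per-session cap is hit. A rough sketch, with both thresholds picked arbitrarily for illustration:

```python
import os
import shutil

MIN_FREE_BYTES = 5 * 1024**3         # hypothetical floor: always leave 5 GB free
MAX_SESSION_BYTES = 100 * 1024**2    # hypothetical per-session write budget

def guarded_append(path, data, session_written=0):
    """Append data only if it won't exhaust the disk or the session's budget."""
    free = shutil.disk_usage(os.path.dirname(os.path.abspath(path))).free
    if free - len(data) < MIN_FREE_BYTES:
        raise OSError("write refused: free-space floor reached")
    if session_written + len(data) > MAX_SESSION_BYTES:
        raise OSError("write refused: per-session budget exceeded")
    with open(path, "ab") as f:
        f.write(data)
    return session_written + len(data)
```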

The researchers also looked at a password strength estimator developed by Dropbox and used by many other web services. They say the tool doesn’t account for the possibility that someone would submit an enormous password that takes the algorithm a long time to analyze and assess. In their tests, some 100-character passwords caused about six seconds of delay, and some 1,000-character passwords could hang the tool for minutes or more. That’s potentially significant, since the tool is embedded in other services and could be exploited to cause broader outages.

If the tool is attacked while running on the user side, it will only crash for that user. But if web services implement the tool on their server side, an attack could potentially have these larger, cascading effects. “If it’s server side, now you’re taking down access to that server for anyone,” Hauke says. “CPUs are spinning and no one else can make a new connection, so it’s denial of service to that entire service.”

A Dropbox spokesperson acknowledged the situation in a statement. “When we launched the tool, we were explicit that it should be implemented on the client side. The researchers are correct that if a service host decides to implement the tool server side, additional architectural decisions will be needed to prevent server-side DoS attacks,” the company’s statement says. “We value the partnership of security researchers and will continue to work closely with them to better the security of the entire industry.”
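
What such an “architectural decision” could look like is simple enough, though the specifics below are an assumption rather than Dropbox’s guidance: cap how much of the password actually reaches the estimator, so the worst-case work per request stays bounded.

```python
MAX_SCORED_CHARS = 100   # hypothetical cap: a 1,000-character password is scored on its first 100

def bounded_strength_score(password: str, estimator):
    # `estimator` is whatever scoring callable the service uses (hypothetical here).
    # Truncating the input bounds the worst-case CPU time per request.
    return estimator(password[:MAX_SCORED_CHARS])
```

Rejecting absurdly long inputs at the API boundary, before any scoring runs, accomplishes the same thing with even less code.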

The researchers have been developing a public tool called ACsploit that developers can use to identify worst-case scenario inputs for their algorithms, and potentially make fundamental design changes based on this information. And they emphasize that while they looked at particular examples to illustrate the problem, algorithmic complexity vulnerabilities can show up in almost any system and need to be better understood. The researchers want developers to gain awareness of the problem so they can avoid these pitfalls, but also want to ensure that penetration testers and the broader security community are on the lookout for these potential exposures as well.
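
The gap that kind of tooling is meant to surface is easiest to see with a toy example; the sketch below is an illustration, not ACsploit itself. The same algorithm, fed two inputs of identical size, does trivial work on one and enormous work on the other purely because of the input’s shape.

```python
import time

def insertion_sort(items):
    # O(n) on already-sorted input, O(n^2) on reverse-sorted input.
    items = list(items)
    for i in range(1, len(items)):
        j = i
        while j > 0 and items[j - 1] > items[j]:
            items[j - 1], items[j] = items[j], items[j - 1]
            j -= 1
    return items

n = 3000
cases = {
    "already sorted (best case)": list(range(n)),
    "reverse sorted (worst case)": list(range(n, 0, -1)),
}
for label, data in cases.items():
    start = time.perf_counter()
    insertion_sort(data)
    print(f"{label:>28}: {time.perf_counter() - start:.3f} s")
```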

These aren’t the massive DDoS attacks you may be used to, but such stealth attacks can produce the same damaging results.

