The cloud is beige - The demise of black box testing

Black-box penetration testing is dead. I’d question why it is even a consideration; it’s of limited and dubious value in almost any context. Wait, wait… I didn’t mean that. Put down the pitchforks and torches, development and QA teams, I’m only talking about black-box penetration testing. Yes, traditional functional or regression testing of software in a black-box manner does have a purpose. Validating the functional requirements of the code has a place, and it’s valuable. Non-functional black-box testing has a place as well, such as when evaluating the sturdiness of an application through load testing and the like. Settle down.

Let’s start with defining a few differences between black-box and white-box testing. Ultimately, black-box testing emulates an attacker with no knowledge of the platform under attack. White-box testing leverages understanding of the business logic of the platform, and emulates an attack carried out by an individual with a wealth of knowledge of the platform and its associated business objectives. Here’s a quick tabular graphic, lovingly assembled by our crack marketing team just for the purpose of this article, intended to demonstrate the vast difference in approaches:*

 

                             Black Box             White Box
User account                 No                    Yes
Administrative account       No                    Yes
Infrastructure architecture  No                    Yes
Software architecture        No                    Yes
Tool usage                   Extensive             Low
Architecture skills needed   Low                   Moderate
Development skills needed    Low                   High
Attack surface               Discovery-dependent   All
Source code                  No                    Yes


*- Yes, there’s a gray option in the middle there somewhere; just bear with me, I’ll get there before I’m done.

Now, I’ve been down this road before. I even wrote an article that waxed philosophical on the concept. If you do care to read that article, be warned that I had just watched a documentary on Bob Ross, so I was inspired to build the whole thing around painting analogies. So, what got me riled up about this topic again? Simply put, it’s because there are still companies out there that fail to understand that a penetration test, like any consulting engagement, is a garbage-in, garbage-out proposition. If you don’t start an engagement with a responsible and appropriate level of information, you’re going to end up with a deliverable that’s neither responsible nor appropriate. You can expect to miss expectations when they’re not set at the outset.

We were recently responding to an RFP that included penetration testing of a number of cloud solutions. During the Q&A we asked whether we would be provided with application and infrastructure architecture information, credentials, or source code. The response was a resounding “no”: the information would not be made available. It was quite a letdown that an organization of this stature, one that had surely had penetration testing performed on these solutions in the past, would fail to prepare appropriately for a penetration test. The responses throughout the rest of the Q&A reflected much the same perspective: a “hacker” wouldn’t have this information, so why should the pen tester? That’s a fabulous question, and it deserves some exploration.

In a traditional (legacy) IT architecture, the process of discovering the bits and pieces of a solution is simplified. Take, for example, a penetration test of an ERP system. In a legacy environment, a penetration test could be done from a black-box approach (to a degree), as the components of the system are generally confined to a logical network segment, typically deployed in the same physical data center. That makes the components – front-end, application, and batch servers; deployment, reporting, and database servers – discoverable and identifiable through ports, protocols, and naming conventions. Modern architecture – aka “cloud architecture” – makes the discovery process exponentially more opaque. Even using credentials to the platform wouldn’t provide a discovery capability that uncovers all the components of the solution. The test would ultimately be incomplete and could leave the organization exposed to major security vulnerabilities.
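
To make that contrast concrete, here is a minimal sketch of what legacy-style discovery looks like in practice: sweep a single network segment for a handful of well-known service ports and lean on hostname conventions to guess each component’s role. The subnet, ports, and naming patterns are purely illustrative assumptions, not taken from any real engagement.

```python
# Illustrative sketch only: legacy-style discovery of ERP components on one
# logical network segment. Subnet, ports, and role mappings are assumptions.
import socket
import ipaddress

# Ports loosely associated with typical ERP tiers (illustrative values)
ROLE_PORTS = {
    443: "web front end",
    8000: "application server",
    1433: "database (MSSQL)",
    1521: "database (Oracle)",
}

def discover(segment: str = "10.10.20.0/24", timeout: float = 0.5):
    """Sequentially probe one segment and guess component roles."""
    findings = []
    for host in ipaddress.ip_network(segment).hosts():
        for port, role in ROLE_PORTS.items():
            try:
                with socket.create_connection((str(host), port), timeout=timeout):
                    # Legacy naming conventions often encode the tier,
                    # e.g. "erp-db01.corp.example.com" (hypothetical name).
                    try:
                        name = socket.gethostbyaddr(str(host))[0]
                    except OSError:
                        name = str(host)
                    findings.append((name, port, role))
            except OSError:
                continue  # closed or filtered port
    return findings

if __name__ == "__main__":
    for name, port, role in discover():
        print(f"{name}:{port} -> likely {role}")
```

In a cloud solution, the object storage, queues, serverless functions, and managed databases never answer on that segment at all, which is exactly why a credential-less sweep leaves so much of the attack surface untouched.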

The question that remains is whether this is “too much” information. Must a completely transparent white-box test be performed for every cloud solution that’s the target of evaluation? No, of course not.

Coalfire has adopted a strategy that strikes a balance between these two approaches. In many circles it’s called a “grey-box” approach, but our approach includes an evaluation of how the components are configured, which makes it closer to white. Maybe call it a beige approach, or off-white. Or maybe “sandalwood.” That was the color of my first car, a 1972 Buick LeSabre. Sounds fancy, but at the end of the day it was kind of a dirty yellowish.

 

                             White Box   “Off-white” Box
User account                 Yes         Yes
Administrative account       Yes         No
Infrastructure architecture  Yes         Yes
Software architecture        Yes         No
Tool usage                   Low         Moderate
Architecture skills needed   Moderate    Moderate
Development skills needed    High        High
Attack surface               All         All
Source code                  Yes         No


Our clients benefit from this approach by getting a thorough test of their solution without their proprietary source code being exposed, without the team being mired in the nuances of software architecture patterns, and without the implied overkill of leveraging administrative accounts for discovery. The approach promotes a collaborative and consultative test methodology that is driven by examination of the business logic, augmented by automated tooling, and informed by the configuration of the supporting cloud services.
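
As a hedged illustration of that last point, the configuration of the supporting cloud services, the sketch below assumes an AWS-hosted solution and uses boto3 to flag storage buckets that lack a public access block. The article doesn’t name a provider, so AWS and the bucket-centric check are assumptions; the point is simply that this class of finding comes from reviewing configuration, not from black-box probing.

```python
# Hedged sketch, assuming an AWS-hosted solution and configured credentials:
# flag S3 buckets with no public access block, or with any guard disabled.
import boto3
from botocore.exceptions import ClientError

def buckets_needing_review():
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            conf = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"
            ]
            if not all(conf.values()):
                flagged.append(name)  # at least one public-access guard is off
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                flagged.append(name)  # no public access block configured at all
            else:
                raise
    return flagged

if __name__ == "__main__":
    for name in buckets_needing_review():
        print(f"Review public access settings for bucket: {name}")
```

A finding like this is exactly the “unprotected storage” scenario mentioned below, and it only surfaces when the tester is allowed to look at how the service is configured.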

I’m still holding out hope for certain things in the cybersecurity industry. I hope there will come a day when I no longer have to lecture clients about their weak passwords; that information security regulations will be seen for what they are – the minimum bar, not something to be “argued down”; and now, I am hoping for the quick demise of black-box cloud penetration testing in favor of an informed and appropriate approach, putting the days of the “unprotected storage” or “leaky database” breach behind us.