Controlling the Cloud

Posted by Andy Rivers on Jul 14, 2017 11:07:02 AM

In CyberSecurity

So, Verizon has joined the esteemed ranks of companies that exposed data by configuring an Amazon S3 bucket as public. These ranks include Booz Allen, the National Geospatial-Intelligence Agency (NGA), the WWE, the Republican National Committee (RNC), and many others.

As I read through the recent news on Verizon and saw what a wide-ranging issue this has become, I wanted to share a few thoughts.

First, some background. Verizon announced that the personal data of at least 6 million of its customers was accidentally leaked online. The leaked data included customer phone numbers, names, and PINs. The leak occurred because NICE Systems, a contractor, misconfigured a security setting on an Amazon S3 bucket.

In the case of Verizon (and the other companies referenced), human error was the root cause (which is so often the case that an entire section of our CISSP training course is dedicated to it). The default setting for an S3 bucket is private. In other words, someone along the way went in and changed that setting. This is both frustrating and good news. It’s frustrating in the sense that these companies did it to themselves, but good because they can do something about it. If this were some zero-day exploit within the S3 architecture, we would all be sitting back waiting on Amazon to respond; but it’s not, and this is something we can tackle directly.
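To make that concrete: because the default is private, an exposure like this implies an explicit public grant on the bucket’s ACL. The sketch below is a hedged example (not any official tool) of how you might flag such grants; the grant dicts mirror the shape that boto3’s `get_bucket_acl()` returns, and the two group URIs are AWS’s real “everyone” and “any AWS account” grantees.

```python
# Hedged sketch: detect ACL grants that expose an S3 bucket publicly.
# The grant structure mirrors boto3's get_bucket_acl() response; the
# URIs below are AWS's predefined "AllUsers"/"AuthenticatedUsers" groups.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(grants):
    """Return only the grants that expose the bucket beyond its owner."""
    return [
        g for g in grants
        if g.get("Grantee", {}).get("Type") == "Group"
        and g.get("Grantee", {}).get("URI") in PUBLIC_GRANTEES
    ]

# Example ACL: a normal owner grant plus the kind of grant behind a leak.
example_acl = [
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner"},
     "Permission": "FULL_CONTROL"},
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
]
```

Running `public_grants(example_acl)` flags the AllUsers READ grant while leaving the owner’s grant alone.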

An obvious solution is to leverage the change management process to ensure that all settings within S3 are bound by the change process and are properly reviewed by the appropriate stakeholders. This is one of the areas that needs to be examined further. As we remove technical complexity from our systems, the same capabilities that make configuration easier also tempt us to bypass our established processes and controls.

Think about what it would take to accomplish the same thing in your environment. In S3, you simply flip a radio button from Private to Public and click “Yes” to “Are you sure?” (Let’s be real: does anyone ever click “No” to that?) To do the same thing in your own environment, you would need a complex discussion among system administrators, network engineers, developers, enterprise architects, and others. It would take exponentially longer, but all along the way you would have to answer the question, “Why do you need this?” That would, hopefully, have led to the decision to keep the sensitive data private. But the ease of use mentioned earlier put the entire decision under the control of one individual, who could have answered that question with “because I want to.” So the first step is to ensure that your current controls are properly applied to cloud services such as AWS, and to take the same zero-tolerance approach you would for any unauthorized change.

In addition to controlling these settings through your change control process, verifying the configuration and accessibility of data on any cloud platform needs to be part of your regular vulnerability scanning program. It is not sufficient to scan only our network hosts for the latest patches; vulnerability scanning needs to be expanded to cover all of our potential attack surfaces, including our cloud services.
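As a rough sketch of what that expansion might look like, the function below turns a bucket inventory into scan findings. The inventory is hard-coded here and the field names (`acl`, `classification`) are illustrative assumptions; in a real program the data would be pulled from the provider’s API and the findings fed into your existing scan reporting.

```python
# Illustrative sketch only: fold cloud storage checks into a scan report.
# Inventory data is hard-coded; in practice it would come from the cloud
# provider's API, and "acl" would reflect the bucket's actual settings.

def scan_buckets(inventory):
    """Return a finding for every bucket that is not private."""
    findings = []
    for name, meta in sorted(inventory.items()):
        if meta["acl"] != "private":
            findings.append(
                f"{name}: acl={meta['acl']}, "
                f"classification={meta['classification']}"
                " -- verify against approved change records"
            )
    return findings

inventory = {
    "customer-exports": {"acl": "public-read", "classification": "sensitive"},
    "internal-logs": {"acl": "private", "classification": "internal"},
}
```

Here `scan_buckets(inventory)` would report only `customer-exports`, the bucket whose setting drifted from private, so a reviewer can check it against the change log.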

Another aspect to acknowledge is that Verizon and many of the companies mentioned above suffered the breach through no actions of their own, but due to the actions of sub-contractors working for them. Remember that even if you sub-contract the work out, you are still ultimately accountable for the data, and your reputation is on the line. Think about it: you recognize the names of the large organizations mentioned in the introduction, whereas the smaller sub-contracting companies are not well known.

Here are some guidelines:

Know thy data. Make sure you know where your sensitive data is and validate that it is actually required. I recommend following what I call a ‘rabbit hole’ approach. Each of your critical data elements should have some authoritative system of record. Start there and follow all the rabbit trails out to see where and how that data is being used in your environment.  Augment this with regular data scans and other Data Loss Prevention (DLP) systems.
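To make the “regular data scans” step concrete, here is a toy pattern scan in the spirit of a DLP pass, flagging strings that resemble the leaked field types (phone numbers and PINs). The regex patterns are illustrative assumptions only; real DLP products use far more robust detection than simple regular expressions.

```python
import re

# Toy DLP-style scan: flag text that resembles the leaked field types.
# These patterns are illustrative assumptions, not production detectors.
PATTERNS = {
    "phone": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "pin": re.compile(r"\bPIN[:\s]*\d{4}\b", re.IGNORECASE),
}

def find_sensitive(text):
    """Return {label: matches} for every pattern found in the text."""
    return {
        label: pattern.findall(text)
        for label, pattern in PATTERNS.items()
        if pattern.findall(text)
    }
```

Pointing a scan like this at file shares, exports, and cloud buckets helps confirm that sensitive data lives only where the system of record says it should.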

Don’t assume. You know the saying: “if you assume, you make an a#& out of U and ME.” Take it one step further and don’t assume that your sub-contractors are diligently following all the necessary security protocols; elaborate the saying to “I won’t let U make an a#& out of ME.” Be sure to include all the necessary security language in your contracts, but also do your due diligence to verify they are handling your data correctly. Remember: it’s your data, and you are ultimately responsible for it and for upholding your reputation.

Trust but Verify. Trust your teams to make the right decisions and follow all the necessary change control processes, but verify those configurations by including them in your vulnerability scanning processes.