A researcher found a voter database from the Republican National Committee that was accidentally made public, exposing the information of 198 million voters. According to cybersecurity firm UpGuard, the voter database was hosted in an Amazon S3 bucket with no authorization or cloud access controls around it.
Unfortunately, this story has become all too common in the last several years. The same firm found a trove of classified national intelligence data in a publicly accessible Amazon Simple Storage Service (S3) bucket in May 2017.
In July 2017, 14 million Verizon customers' data was exposed in an unsecured S3 bucket controlled by Israeli software company Nice Systems. In 2015, 1.5 million medical records were exposed in Amazon Web Services (AWS) S3 after a software company handling health insurance claims left a database open to the public.
A lack of policies and controls
There are many lessons you can learn from these exposures. The first is that core security best practices have largely gone out the window when it comes to cloud services, and cloud storage in particular.
Perhaps even more concerning is the lack of policy control, governance and risk assessment related to sensitive data being put into the cloud in the first place. Why was any of this data sent into the cloud, and why was there little to no attention and scrutiny on the data once it was placed there? These are problems that enterprise IT organizations need to address, and quickly.
All cloud data needs to be classified using internal and compliance-focused classification efforts, and a cloud policy and risk review should ideally determine when and if data should be sent to the cloud at all. Assuming that data is approved for use in the cloud, there are many best practices organizations should follow to ensure that cloud use is carefully controlled and monitored.
Implementing cloud access controls
To prevent an issue like the recent Amazon S3 bucket exposures, organizations need a higher level of due diligence and monitoring. A cloud access security broker (CASB) service can help with many SaaS and cloud services, tracking and controlling cloud use across the enterprise. In some cases, a CASB can detect and alert you when sensitive data is sent to cloud service environments, or even prevent it outright.
Beyond that, it's up to security teams to set up and control access and permissions to cloud storage environments, and Amazon S3 offers a number of controls that AWS customers should take advantage of.
First, there are lots of access controls and permissions available for any Amazon S3 bucket implementation. For any sensitive data put into the cloud, companies should require both strict cloud access controls with identity management policies, as well as continuous monitoring and logging.
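As one concrete piece of the logging requirement, S3 server access logging can be switched on from the AWS CLI. A minimal sketch, assuming ACL-based log delivery is still in use for the account (bucket names below are placeholders, not from any incident described here):

```shell
# Create a separate bucket to receive the access logs (placeholder name).
aws s3api create-bucket --bucket example-access-logs --region us-east-1

# Grant the S3 log delivery group permission to write logs into it.
aws s3api put-bucket-acl --bucket example-access-logs \
    --grant-write 'URI="http://acs.amazonaws.com/groups/s3/LogDelivery"' \
    --grant-read-acp 'URI="http://acs.amazonaws.com/groups/s3/LogDelivery"'

# Turn on server access logging for the bucket holding sensitive data.
aws s3api put-bucket-logging --bucket example-sensitive-data \
    --bucket-logging-status '{
      "LoggingEnabled": {
        "TargetBucket": "example-access-logs",
        "TargetPrefix": "logs/"
      }
    }'
```

The resulting log records show every request against the bucket, which gives auditors something to review continuously rather than after a breach.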
There are two main methods to control access to Amazon S3 buckets and data. The first method is the simplest, available through the S3 graphical console in the access control list (ACL) configuration settings. The S3 ACLs enable you to create basic cloud access controls for authenticated AWS users and any anonymous users. By default, the Amazon S3 bucket owner has read/write access to everything, including objects and files stored in the buckets and the permissions on the buckets themselves.
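The same ACL settings exposed in the console can be inspected and tightened from the CLI. A hedged sketch (bucket and object names are placeholders):

```shell
# Inspect the current ACL grants on a bucket.
aws s3api get-bucket-acl --bucket example-sensitive-data

# Reset the bucket ACL to private: only the owner retains full control.
aws s3api put-bucket-acl --bucket example-sensitive-data --acl private

# ACLs also apply per object; check an individual object's grants.
aws s3api get-object-acl --bucket example-sensitive-data --key records.csv
```

A grant to the `AllUsers` group in the first command's output is exactly the kind of anonymous access that exposed the databases described above.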
The second method to secure S3 buckets is more involved, but far more granular. This involves setting up bucket policies with AWS Identity and Access Management, with more specific policy-based access and auditing for both the buckets and their resources.
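To illustrate the policy-based approach, here is a minimal sketch of a bucket policy that grants read access to a single IAM role and denies unencrypted transport. The account ID, role name and bucket name are placeholders, not values from the article:

```shell
# Write the policy document, then attach it to the bucket.
cat > policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowOnlyAuditRole",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:role/audit-role" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-sensitive-data",
        "arn:aws:s3:::example-sensitive-data/*"
      ]
    },
    {
      "Sid": "DenyInsecureTransport",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::example-sensitive-data",
        "arn:aws:s3:::example-sensitive-data/*"
      ],
      "Condition": { "Bool": { "aws:SecureTransport": "false" } }
    }
  ]
}
EOF

aws s3api put-bucket-policy --bucket example-sensitive-data \
    --policy file://policy.json
```

Unlike ACLs, policies like this one can scope access by principal, action and condition, which is what makes auditing them meaningful.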
Once cloud access controls are set up, there are many tools that pen testers and vulnerability assessment teams can use to discover and assess bucket policies and general security posture. The AWS command-line interface (CLI) can be used to manage buckets and list their policies remotely.
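For an assessment team with credentials for the account, that review can start with a few CLI calls. A sketch, with a placeholder bucket name:

```shell
# Enumerate all buckets owned by the account.
aws s3api list-buckets --query 'Buckets[].Name' --output text

# Pull the attached bucket policy (errors if no policy is set).
aws s3api get-bucket-policy --bucket example-sensitive-data

# Ask S3 directly whether the bucket's policy makes it public.
aws s3api get-bucket-policy-status --bucket example-sensitive-data
```

The `IsPublic` flag returned by the last command is a quick first-pass signal before a deeper manual review of each policy statement.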
Independent penetration tester Robin Wood wrote a tool called Bucket Finder that can be used to brute-force the S3 namespace looking for Amazon S3 buckets. Other discovery and assessment tools include S3 Knock, Lazy S3 and AWS Scan.
Regardless of the tools used, organizations need to define policies upfront, control data flow to cloud storage using on-premises or in-cloud CASB and gateway platforms and services, and monitor continuously for S3 buckets and other cloud storage nodes that may not be secure according to standards and compliance requirements.
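That continuous-monitoring step can be as simple as a scheduled script that flags any bucket in the account whose ACL grants access to anonymous or all-authenticated users. A minimal sketch, assuming the AWS CLI is configured with read access to every bucket:

```shell
#!/bin/sh
# Flag any bucket whose ACL grants access to the AllUsers (anonymous)
# or AuthenticatedUsers groups -- the misconfiguration behind the
# exposures described above.
for bucket in $(aws s3api list-buckets --query 'Buckets[].Name' --output text); do
  grants=$(aws s3api get-bucket-acl --bucket "$bucket" \
      --query 'Grants[].Grantee.URI' --output text 2>/dev/null)
  case "$grants" in
    *AllUsers*|*AuthenticatedUsers*)
      echo "WARNING: $bucket has a public or all-authenticated ACL grant" ;;
  esac
done
```

Run from cron or a CI job, a check like this turns the standards and compliance requirements mentioned above into an alert rather than a headline.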