Cloud Security INSIGHTS Archive: January 2018
Customers Have a Role in Reducing the Deluge of Cloud Breaches
By Teri Radichel
As the number of companies moving to the cloud increases, so does the number of cloud breaches. In 2017, a variety of attacks on cloud systems hit major corporations and government agencies around the world. One of the most prevalent forms of cloud data leak stemmed from improperly configured Amazon Web Services (AWS) S3 buckets. Organizations such as Verizon and Booz Allen Hamilton exposed credentials and sensitive data stored in misconfigured AWS storage buckets, and in many cases the data was not correctly encrypted either.
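The misconfiguration behind many of these leaks is an access control list (ACL) that grants read access to everyone on the internet. As a minimal sketch, the check below flags such grants; the dictionary shape mirrors the `Grants` list that boto3's `get_bucket_acl` returns, but the function name and sample data are hypothetical:

```python
# Group URIs that make a bucket readable by the world or by any AWS account.
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl):
    """Return the permissions an ACL hands to all (or all authenticated) users."""
    return [
        grant["Permission"]
        for grant in acl.get("Grants", [])
        if grant["Grantee"].get("URI") in PUBLIC_GRANTEES
    ]

# A bucket left world-readable -- the pattern behind many 2017 leaks:
leaky_acl = {
    "Grants": [
        {"Grantee": {"Type": "Group",
                     "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
         "Permission": "READ"},
    ]
}
print(public_grants(leaky_acl))  # → ['READ']
```

In practice, a customer would feed this check the ACL of each bucket returned by the S3 API and treat any non-empty result as a finding.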
In other instances, unprotected cloud credentials allowed attackers to create unauthorized resources, as in the OneLogin and DXC breaches. Improperly secured application programming interfaces (APIs) exposed sensitive data that customers had submitted to a cloud security service. Elsewhere, resources exposed on open networks and unpatched applications have led to Bitcoin miners and ransomware appearing on cloud servers and in Docker containers.
AWS regularly releases new security features to help customers protect data stored in the cloud, and other cloud service providers are following suit. According to Werner Vogels, CTO of Amazon, 3,966 major new features were released in 2017, and by the time you read this, that number will likely be higher.
Many of those features are security related. Some of the new security features announced just before and during AWS re:Invent, Amazon's largest annual conference, included S3 bucket security enhancements, security group descriptions, cross-region VPC peering to protect sensitive data moving between two geographic locations, and a new security service called GuardDuty that monitors customer accounts for security problems. The API Gateway service can now run in an Amazon virtual private cloud specific to a single customer, and DynamoDB tables no longer require access over the public internet.
Because some companies like Capital One have stated that they can be more secure in the cloud, other organizations may think cloud providers will take care of security for them. As Boyan Dimitrov explains in a recent re:Invent presentation on Compliance and Top Security Threats, “Amazon is responsible for the security of the cloud, but customers are responsible for security in the cloud.” This concept comes from the AWS Shared Responsibility Model, published by Amazon to explain which aspects of security it will take care of for customers versus the areas customers are responsible for securing.
Customers must turn on and correctly configure security features to receive the protection those services provide. Amazon does not enable by default features that cause customers to incur additional costs; customers must enable those services themselves. Organizations also need to continue following best practices when designing networks, applications and access to cloud resources, and ongoing maintenance must ensure that customer systems remain securely configured after initial deployment.
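Encryption of data at rest is one such opt-in protection: until the customer attaches a policy or enables default encryption, nothing is enforced. As a minimal sketch, the bucket policy below refuses uploads that lack server-side encryption; the bucket name is a hypothetical placeholder, and the boto3 call that would apply it appears only in a comment:

```python
import json

BUCKET = "example-sensitive-data"  # hypothetical bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnencryptedUploads",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:PutObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
        # Reject any PutObject that does not request SSE with AES-256.
        "Condition": {
            "StringNotEquals": {"s3:x-amz-server-side-encryption": "AES256"}
        },
    }],
}

policy_json = json.dumps(policy)
# boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=policy_json)
# would attach it -- the protection exists, but only if the customer applies it.
```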
Before moving sensitive data to the cloud, customers need to take the time to read the security best practices for the cloud services they are using. Customers should read their service level agreement with each cloud vendor to ensure they understand what the cloud platform will provide for security — and what they will need to do themselves to keep data safe. Documentation about the cloud platform as a whole will provide security architecture and process information.
Beyond that, individual cloud services such as AWS S3 have documentation that explains service-specific security controls and best practices. Customers need to familiarize themselves with all this documentation and, up front, design systems for security with both deployment and long-term maintenance in mind. Until customers fully understand their cloud security responsibilities and best practices, vendor improvements will have, at best, a limited effect on preventing breaches in the cloud.
Teri Radichel - CEO, 2nd Sight Lab