A Breach-Proof Public Cloud Database Security Program — Part 4: Asset Discovery

Sep 10, 2020 / by SecureCloudDB

In this 10-part series, we review the key components needed to formulate and apply a consistent, regimented cloud database security program that helps ensure data is only available through authorized access. Part 3 discussed public cloud database configuration best practices. Part 4 below reviews considerations related to asset discovery.

Cloud databases can be moving targets, making it tough to know where databases and their backups exist. Simply put: you cannot secure data if you do not know whether or where it exists. The better an organization can keep track of the data it is responsible for, the better its chances of protecting it.

A Thorough Cloud Database Security Program Covers More Than Just “Databases” 

What is a “cloud database”? The answer is expanding, and it’s worth considering when building a cloud database security program. In the cloud, the nature of databases has changed, and recognizing and accounting for those dynamics is integral to an effective security program. Consequently, one could argue that the definition of a database is different in the cloud than in an on-premises data center: a cloud database is whatever an engineering team dreams up and maps out.

This means one might consider Amazon Simple Storage Service (Amazon S3) paired with Amazon Athena to be a database, even though Athena is a query service that enables data analysis directly from S3 using standard SQL and Amazon S3 is an object storage service that houses the data. Another might consider Amazon Redshift, “the most popular and fastest cloud data warehouse,” a database (while others would argue that a database records data whereas a data warehouse is used to analyze it). Even Cloud Economist Corey Quinn has been known to call out Amazon Route 53, saying, “I frequently joke on Twitter about my favorite database being Route 53, which is AWS’s managed database service.” Although Corey’s comment is made in jest, it’s hard to deny that Route 53 functions as a key-value store.
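
To make the S3-plus-Athena pairing concrete, here is a minimal sketch using boto3. The database, table, and results bucket names are hypothetical placeholders, not resources referenced in this series:

```python
import boto3

# Athena runs standard SQL directly against objects stored in S3.
# All names below are hypothetical placeholders.
athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT user_id, event_time FROM access_logs LIMIT 10",
    QueryExecutionContext={"Database": "example_analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)

print("Started Athena query:", response["QueryExecutionId"])
```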

The point here is not to get into semantics, but rather to underscore the importance of securing data across all services and your entire cloud environment, especially because on-demand provisioning means instances can spin up and down on their own. It’s necessary to expand protections beyond “databases” to include services such as Amazon RDS, DynamoDB, and Elasticsearch so that your security program reflects all potential attack surfaces.

Asset Discovery and Inventory

Because of the ephemeral nature of the cloud, it is not uncommon for an organization to be unaware of what data it actually has and where that data is stored. Additionally, because users can self-provision resources, security teams often have far less visibility into public cloud environments than into on-premises data centers, especially when multiple cloud accounts are in use. Add poor tagging practices, a common byproduct of unsupervised on-demand provisioning, and it can be very difficult to track assets down and ensure nothing is missed.

Regularly running scans to identify and inventory public cloud databases enables you to locate all assets, which strengthens your security posture, protects against breaches, and supports audits. This is where it pays to scan across multiple cloud services: how can you protect what you don’t know exists? Asset discovery must be an ongoing process in order to detect when a database or other asset is added, deleted, or changed.
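
As a sketch of what that ongoing process might look like, two inventory snapshots can be diffed to surface additions, deletions, and changes. The ARN-to-fingerprint structure below is an assumption for illustration, not a prescribed format:

```python
import hashlib
import json

def fingerprint(asset: dict) -> str:
    """Stable hash of an asset's configuration, used to detect drift."""
    return hashlib.sha256(json.dumps(asset, sort_keys=True).encode()).hexdigest()

def diff_inventories(previous: dict, current: dict) -> dict:
    """Compare two snapshots mapping asset ARN -> configuration fingerprint."""
    return {
        "added": current.keys() - previous.keys(),
        "deleted": previous.keys() - current.keys(),
        "changed": {arn for arn in current.keys() & previous.keys()
                    if current[arn] != previous[arn]},
    }

# Example with placeholder ARNs: one database drifted, one table appeared.
yesterday = {"arn:aws:rds:us-east-1:111122223333:db:orders": "abc123"}
today = {"arn:aws:rds:us-east-1:111122223333:db:orders": "def456",
         "arn:aws:dynamodb:us-east-1:111122223333:table/sessions": "789aaa"}
print(diff_inventories(yesterday, today))
```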

Pro tip: Build an inventory in the public cloud by collecting the list of cloud accounts (along with read-only credentials for each account) and querying the APIs for each of the different possible database types. Employing automated tools or a third-party vendor helps ensure that this process is both efficient and inclusive.
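
A minimal sketch of that pro tip using boto3 follows. The SecurityAudit role name and the account, region, and service lists are illustrative, not exhaustive; a real sweep would cover every database service and region in use:

```python
import boto3

def readonly_session(account_id: str, role_name: str = "SecurityAudit"):
    """Assume a read-only audit role in the target account."""
    creds = boto3.client("sts").assume_role(
        RoleArn=f"arn:aws:iam::{account_id}:role/{role_name}",
        RoleSessionName="asset-discovery",
    )["Credentials"]
    return boto3.session.Session(
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

def discover_databases(session, region: str) -> list:
    """Query the APIs of several database services in one region."""
    found = []
    rds = session.client("rds", region_name=region)
    for page in rds.get_paginator("describe_db_instances").paginate():
        for db in page["DBInstances"]:
            found.append(("rds", db["DBInstanceIdentifier"], db["Engine"]))
    ddb = session.client("dynamodb", region_name=region)
    for page in ddb.get_paginator("list_tables").paginate():
        for table in page["TableNames"]:
            found.append(("dynamodb", table, "dynamodb"))
    redshift = session.client("redshift", region_name=region)
    for page in redshift.get_paginator("describe_clusters").paginate():
        for cluster in page["Clusters"]:
            found.append(("redshift", cluster["ClusterIdentifier"], "redshift"))
    return found

# Placeholder account and region lists; enumerate yours programmatically.
for account in ["111122223333", "444455556666"]:
    session = readonly_session(account)
    for region in ["us-east-1", "us-west-2"]:
        for service, identifier, engine in discover_databases(session, region):
            print(account, region, service, identifier, engine)
```

Because the assumed role only needs describe and list permissions, this keeps the inventory process read-only rather than opening access to the databases themselves.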

Ultimately, when you look at your inventory, you need to be able to see what it looked like over a period of time as well as today. For example, were all the nodes in a cluster properly secured, and were any accessed or attacked, even if a particular node no longer exists? Employing an automated system goes a long way toward accurately capturing all databases and their activity (more on that later); in fact, given the dynamic and ephemeral nature of cloud environments, it’s practically a requirement. APIs can be used to gather and update the inventory and maintain full historical context.

Ideally, you want to avoid opening up access to your databases in unsafe ways. Users should therefore consider tools that simplify the process by leveraging the providers’ APIs, paired with a non-invasive, lightweight proxy such as a self-updating, zero-touch agent for more detailed analysis and Database Activity Monitoring (DAM). Shortlist tools that are built specifically for the cloud, because they are designed to scale to thousands of accounts and hundreds of thousands of databases. Best practice is to locate and secure critical databases first.
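
To preserve the historical context described above, one simple pattern is an append-only inventory: snapshots are recorded with timestamps and never overwritten, so you can answer questions about nodes that no longer exist. The local JSON Lines file below is purely illustrative; a production system would more likely write to a durable datastore using the same pattern:

```python
import datetime
import json

def record_snapshot(assets: list, path: str = "inventory.jsonl") -> None:
    """Append a timestamped inventory snapshot; nothing is overwritten,
    so deleted assets remain visible in the historical record."""
    snapshot = {
        "taken_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "assets": assets,
    }
    with open(path, "a") as f:
        f.write(json.dumps(snapshot) + "\n")

def inventory_as_of(when: str, path: str = "inventory.jsonl") -> list:
    """Return the latest snapshot taken at or before `when`.
    ISO 8601 timestamps in the same zone compare correctly as strings."""
    best = None
    with open(path) as f:
        for line in f:
            snap = json.loads(line)
            if snap["taken_at"] <= when and (
                    best is None or snap["taken_at"] > best["taken_at"]):
                best = snap
    return best["assets"] if best else []
```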

Read Part 5 of this series, where we unpack vulnerability assessments, including the step-by-step process and critical areas to check.

Hack-Proof AWS Databases in the Public Cloud 

✔  Stop attacks in their tracks with real-time Database Activity Monitoring

✔  Control vulnerabilities with the Security Violations Assessment 

✔  Demonstrate ongoing progress with dynamic Risk Assessment Scoring

Receive a gift card worth $25 USD when you set up and scan your environment using SecureCloudDB.

Improve Your Security Posture Today

Tags: Security Program Series
