As cloud-based data repositories continue to expand, so do the challenges surrounding their security. Part of the problem is that these repositories, which can store massive amounts of information from many sources, often span multiple cloud providers. One example is the repository federal agencies use to track contractor compliance with the Cybersecurity Maturity Model Certification (CMMC), and there are many others throughout all levels of government.
Agencies are responsible for protecting the data held in these repositories. Cloud providers secure only the underlying infrastructure; cybersecurity for the data itself remains the responsibility of the agencies. It’s imperative for agencies to take proactive measures to secure and monitor their data repositories.
Agencies must implement practices that strengthen their security posture while simultaneously delivering information on demand to customers.
The “3-2-1” Rule
Let’s start by looking at disaster recovery and backups. As a best practice, agencies should consider implementing the 3-2-1 rule for backups to help ensure post-incident recovery. Adhering to this rule means maintaining at least 3 copies of important datasets on at least 2 different types of media, with at least 1 copy stored offsite and preferably offline, hence the term “3-2-1 rule.”
Rotating media storage components (disks, tapes, storage locations, etc.) for local storage adds a layer of protection, as offline media backups can be used for recovery if online media has been corrupted by malware. Data backup and snapshot technologies can be employed to make multiple local and offsite copies of important data. Cloud vendors can assist by storing agency-encrypted datasets in both short- and long-term digital storage.
Finally, maintaining fresh copies in multiple network locations is critical to ensuring rapid recovery after an incident.
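The 3-2-1 rule is simple enough to express as a checklist. As a minimal sketch (the media labels and the sample plan below are illustrative, not agency-specific), a backup plan can be validated in a few lines of Python:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    media: str      # e.g. "disk", "tape", "cloud-object-store"
    offsite: bool   # stored at a separate location?

def satisfies_3_2_1(copies):
    """Check a backup plan against the 3-2-1 rule:
    at least 3 copies, on at least 2 distinct media types,
    with at least 1 copy held offsite."""
    return (
        len(copies) >= 3
        and len({c.media for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

plan = [
    BackupCopy("disk", offsite=False),               # local snapshot
    BackupCopy("tape", offsite=False),               # rotated tape
    BackupCopy("cloud-object-store", offsite=True),  # offsite copy
]
print(satisfies_3_2_1(plan))  # True for this plan
```

A check like this only validates the shape of the plan; the offsite copy should additionally be kept offline or immutable so malware cannot reach it.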
Tools to Successfully Secure and Monitor
Storing sensitive agency data on a cloud service provider’s platform demands stringent security and monitoring tools. One method is to keep secrets such as credentials, tokens, and encryption keys in an automated vault, either using the cloud service provider’s existing management tools or a product from an independent third-party software vendor. An ideal secrets management tool is platform-agnostic and has plug-ins for popular cloud platforms, containers, and coding frameworks.
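As one illustration of the vault pattern, assuming a HashiCorp Vault deployment accessed through the `hvac` Python client, an application might write credentials to the vault once and fetch them at runtime instead of hard-coding them. The path `agency-app/db` and the helper names below are hypothetical:

```python
def store_db_credentials(client, username, password):
    """Write database credentials into Vault's KV v2 secrets engine.

    `client` is an authenticated hvac.Client; the path is a
    hypothetical example, not a required layout."""
    client.secrets.kv.v2.create_or_update_secret(
        path="agency-app/db",
        secret={"username": username, "password": password},
    )

def read_db_credentials(client):
    """Fetch the credentials back at runtime; nothing sensitive
    ever lands in source code or config files."""
    response = client.secrets.kv.v2.read_secret_version(path="agency-app/db")
    return response["data"]["data"]

# Typical usage (requires a running Vault server and a valid token):
#   import hvac
#   client = hvac.Client(url="https://vault.example.gov:8200", token=token)
#   store_db_credentials(client, "app_svc", generated_password)
```

Cloud-native equivalents (e.g. the providers’ own secret managers) follow the same write-once, fetch-at-runtime pattern.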
In addition to backup and vault technologies, federal IT portfolios commonly employ log and event aggregators—also known as security information and event management (SIEM) tools—and privileged access management (PAM) tools for their systems and environment. A SIEM tool can give the team insight into baseline usage, so anomalous behavior and access are more readily identifiable. PAM tools can issue one-time passwords that temporarily grant privileged users the access rights needed for routine tasks requiring stringent security controls.
Authentication should be further hardened by enforcing a secure password policy for all users, privileged and otherwise. Employ a password manager so hardened passwords are readily accessible to users without forcing them to rely on memory or sticky notes.
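As a minimal illustration of enforcing such a policy programmatically, the check below favors length over composition rules; the 15-character minimum and the short weak-password list are assumptions for the sketch, not agency policy:

```python
import re

def meets_policy(password, min_length=15):
    """Illustrative password-policy check: minimum length plus
    rejection of a few obviously weak patterns. Real policies
    should follow the agency's own guidance."""
    if len(password) < min_length:
        return False
    if password.lower() in {"password", "letmein", "changeme"}:
        return False
    # Reject long runs of one repeated character, e.g. "aaaaaaaaaaaaaaa"
    if re.search(r"(.)\1{4,}", password):
        return False
    return True

print(meets_policy("correct horse battery staple"))  # True
print(meets_policy("hunter2"))                       # False: too short
```

Length-first rules pair naturally with a password manager, since users never need to memorize the long credentials the policy demands.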
Moving to the cloud can provide clear cost savings and a shortened procurement cycle. As agencies migrate data repositories and applications to the cloud, federal IT professionals find themselves needing to secure a larger and more amorphous perimeter. Applying existing tools and best practices, accompanied by changes recommended by bodies charged with oversight, will best protect agency data while providing the secure access end users require.