With more and more government agencies, such as the EPA, the Department of Energy, and the VA, embracing mobility, branch offices, and teleworking, there is an equal demand for data management and protection strategies that ensure immediate data backup and recovery along with the continuity and security of operations. Agency IT managers must not only contend with more endpoints to protect and a growing number of potential threat sources; to complicate matters further, if they cannot keep track of data at the edge of the network, the legal and regulatory consequences skyrocket.
Protecting and recovering data at remote sites and endpoints using legacy technologies can consume scarce resources, requires expertise that often isn't on hand, and is prone to failure. In addition, these isolated systems can be expensive to deploy and manage. In the event of a site-level disaster, whether a natural disaster, human error, a virus, or a security breach, organizations need a copy of important data stored safely, far from the remote office that holds the original, yet still easily and rapidly accessible. Without a centralized strategy, organizations risk losing data, which can reduce operational capability and employee productivity and carry a range of other operational consequences.
While there are likely as many solutions as there are threats to data, spanning hardware, software, and usage policies, some best practices can help federal IT practitioners steer their data away from the edge, metaphorically speaking, and into safer territory.