Big data holds the potential for the federal government to overhaul how agencies carry out mission-critical activities that serve the public and other key constituents. To fulfill this potential, though, the field urgently needs standards and regulations on two fronts: data privacy and data trust.
These are two of the conclusions reached in collaboration sessions held by the Advanced Technology Academic Research Center and MITRE, with participants from government agencies, academia, and industry. The sessions were held in June as the second half of ATARC’s Federal Big Data Summit, and their findings have been released in a white paper.
The five sessions addressed the intersection of big data and the Internet of Things; how to drive innovation with big data; progress being made toward prescriptive analytics; the challenges and potential solutions for data privacy; and using big data and analytics in healthcare.
“The emergence of the Internet of Things (IoT) is presenting a massive challenge with regard to data management,” said Joe Kim, Senior Vice President and Global CTO of SolarWinds, in a recent conversation with Federal Technology Insider. Kim shared that “the volume and variety of data IoT devices generate is making collaboration between technical engineers and policy-makers difficult.” He continued, “This is because the volume and variety of data means that sensitive, personal, and behavioral information that must be verified and protected becomes more difficult to identify. What happens if personally identifiable information is in a picture instead of text? This is all happening while the government faces a shortage of skilled data managers.”
There’s no doubt that addressing these challenges will be hard. For instance, vulnerabilities in IoT devices will be difficult to patch without continuous monitoring. This is one area where standards are required for embedded system security, long-term operation, and ongoing maintenance updates.
The potential for big data to draw upon IoT-generated data to drive innovation in government is significant. But Kim pointed out that to truly capitalize on big data analytics, applications will need to be able to manage the vast quantities of data within existing databases as agencies work through funding and IT modernization challenges. Agencies will need to keep database performance optimal, identify bottlenecks, and gain insight into performance in order to manage costs and ensure the reliability that data scientists, and eventually policy makers, need to make use of this vast new national resource.