Sharing data has never been easier. Technology advancements have made it effortless for organizations to share information at the touch of a button. Yet even though the act of sharing itself may not be difficult, challenges remain, including ensuring accurate reporting, maintaining trust in the data, and supporting informed decision-making.
With regard to data sharing, Snowflake’s Mark Kochanski said that “historically, data sources flow through a number of different stops within an organization before they reach their ultimate consumer.” This convoluted journey has led organizations to build costly, complex architectures involving multiple solutions, each potentially requiring separate administrative and end-user skill sets, whether the infrastructure is physical, virtual, or cloud based. The many touchpoints the data flows through can create governance challenges in which multiple copies of the data circulate. This becomes an issue when individuals at an organization receive different copies of the data to complete the same work and then report different findings.
One field where the lack of a single source of truth can have significant impacts is healthcare. Over the past year alone, healthcare systems have had to manage data sharing around the COVID-19 pandemic, including outbreak management, PPE supply management, and contact tracing. “Because of the situation, the data was sitting on so many different systems, there was no one version of the truth. It was a big limiting factor to many health departments, because every department was trying to pull in their own information. So, the answer to the same question would be different, and when that happens, that invariably creates issues with respect to the trust and reliability in the data,” said Uday Katira, Managing Director in Deloitte’s Strategy & Analytics practice.
Katira shared a four-step approach that helped healthcare organizations overcome these data-sharing issues.
- By setting up a data governance program, an organization can establish “protocols for ownership roles and responsibilities, while defining what each metric means.” Consistent protocols let different organizations share, and analyze, the data in a consistent way.
- To ensure reliable data, organizations need to “implement a robust data quality management process in the data pipeline itself, so that when the data comes in from the source to the centralized data hub, it goes through rigorous checks, quality standardization, enrichment process, and deduplication process,” said Katira. By creating a data hub to curate the data, organizations can rely on and trust that data.
- By automating the data pipeline, organizations can access and analyze the data in real time. Katira emphasized that at this step, organizations can perform either “descriptive analytics, in terms of doing the operational reports, or doing interactive analytics through the use of advanced machine learning algorithms to predict certain outcomes” and take action. The shared data can then be used to make informed, data-driven decisions.
- A vital issue to keep top of mind when sharing data is security. Organizations need to create a robust data security and privacy framework. For instance, “the data would be available to only those that are in need to see the data and act on the data. This is another aspect of the solution, which was enabled through various components of architecture,” commented Katira.
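The four steps above can be sketched as a toy Python pipeline. This is only an illustration of the pattern, not the architecture described by Katira; every field name, role, and rule below is a hypothetical example.

```python
# Toy sketch of the four-step approach. All field names, roles, and
# quality rules here are hypothetical illustrations.

# Step 1: governance -- one shared definition per metric.
METRIC_DEFINITIONS = {
    "active_cases": "Confirmed cases not yet marked recovered or deceased",
}

# Step 4: security -- each role sees only the fields it needs.
ROLE_VISIBLE_FIELDS = {
    "analyst": {"region", "active_cases", "report_date"},
    "admin": {"region", "active_cases", "report_date", "patient_id"},
}

def ingest(records):
    """Step 2: quality checks, standardization, and deduplication as
    records flow from the sources into the centralized data hub."""
    clean, seen = [], set()
    for rec in records:
        # Rigorous checks: drop records missing required fields.
        if not rec.get("region") or rec.get("active_cases") is None:
            continue
        # Standardization: one canonical spelling per region.
        rec["region"] = rec["region"].strip().title()
        # Deduplication: keep one record per (region, report date).
        key = (rec["region"], rec["report_date"])
        if key in seen:
            continue
        seen.add(key)
        clean.append(rec)
    return clean

def query(hub, role):
    """Steps 3 and 4: serve the curated data, restricted to the
    fields the caller's role is allowed to see."""
    allowed = ROLE_VISIBLE_FIELDS[role]
    return [{k: v for k, v in rec.items() if k in allowed} for rec in hub]

hub = ingest([
    {"region": " ontario ", "active_cases": 120,
     "report_date": "2021-05-01", "patient_id": "p1"},
    {"region": "Ontario", "active_cases": 120,
     "report_date": "2021-05-01", "patient_id": "p2"},   # duplicate
    {"region": "Quebec", "active_cases": None,
     "report_date": "2021-05-01"},                       # fails quality check
])
print(query(hub, "analyst"))
```

Because every consumer queries the same curated hub through the same role-filtered view, two analysts asking the same question get the same answer, which is the "one version of the truth" the article describes.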
These measures are helpful not only in the healthcare industry but across the public sector. By partnering with trusted industry leaders on solutions for accurate and secure data sharing, organizations can better achieve their mission goals.