A few years ago, a terabyte of data seemed like an unfathomable amount of information that only a few highly sophisticated organizations could possibly generate. Now we’ve blown past the terabyte and are well into storing petabytes and exabytes of data. Beyond storing the data, which is a huge challenge in and of itself, the data generated by an expanded web presence and the Internet of Things is only useful if it can be leveraged, mined, or otherwise manipulated to generate insights about mission-critical activities, citizen services, and other key government responsibilities. But how can this crucial activity, which requires millions of database queries, be accomplished without causing the database to crash?
For some, the liberal application of new technology is the only answer. But for those with a better understanding of existing infrastructure, leveraging the underutilized capabilities of existing technology investments provides a budget-savvy solution. For instance, MySQL is often underutilized in its ability to process data for mining and in the opportunity to then leverage the predictive insights from that data for mission planning. Not only are there architecture and processing capabilities that haven’t seen the light of day, but there are also advantages in data security and data auditing, such as the ability to anonymize data in MySQL prior to sending it on through the system for further analysis in Hadoop.
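To make the anonymization idea concrete, here is a minimal sketch of one way to mask sensitive fields in application code after pulling rows from MySQL and before handing them to Hadoop. The field names (`citizen_id`, `zip`, `service`) and the salt are hypothetical, purely for illustration; the masking approach (a salted SHA-256 digest) is one common technique, not a prescribed Oracle or MySQL workflow.

```python
import hashlib

def anonymize(rows, sensitive_fields, salt="example-salt"):
    """Replace sensitive field values with truncated, salted SHA-256 digests.

    The digest is deterministic, so the same input value always maps to the
    same token -- records can still be joined downstream in Hadoop without
    exposing the underlying identifier.
    """
    masked_rows = []
    for row in rows:
        clean = dict(row)  # copy so the original result set is untouched
        for field in sensitive_fields:
            if field in clean:
                digest = hashlib.sha256(
                    (salt + str(clean[field])).encode("utf-8")
                ).hexdigest()
                clean[field] = digest[:16]  # shortened token for readability
        masked_rows.append(clean)
    return masked_rows

# Example rows as they might come back from a MySQL SELECT
# (hypothetical table and column names)
rows = [
    {"citizen_id": "123-45-6789", "zip": "20500", "service": "permits"},
    {"citizen_id": "987-65-4321", "zip": "22202", "service": "licensing"},
]

masked = anonymize(rows, sensitive_fields=["citizen_id"])
```

The same kind of digest can also be computed inside MySQL itself during export, using the built-in `SHA2()` function, which keeps the raw identifiers from ever leaving the database.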
Recently, Oracle explored these key big data issues in an in-depth webinar, which you can listen to here or check out their overall solutions for government agencies here.