The federal government is sitting on vast amounts of data, and the volume only keeps growing. From surveillance data collected by military UAVs to the records on buildings and other assets that FEMA tracks for natural-disaster response, our government is becoming ever more reliant on massive volumes of data for enhanced decision-making.
So, what exactly is “big data”? According to Dan Vesset, program vice president of business analytics with IDC, big data is “a new generation of technology and architecture designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture and/or analysis.”
One of the challenges of big data is making it actionable by collecting, storing, preserving and analyzing it. This is no easy task. We require next-generation technology and solutions that will allow us to fully leverage all government data. We must also consider how big data will influence cloud-computing initiatives, and whether it will have an impact on overall bandwidth usage.
Fortunately, the Obama administration is moving ahead with its “Big Data Research and Development Initiative,” a $200 million commitment spanning six federal agencies. The initiative is based on recommendations from the President’s Council of Advisors on Science and Technology, which found that the government was under-investing in big data technologies.
As a result, many industry vendors are already jumping on the big data bandwagon and mapping out future solutions and innovations to meet this need.
I would implore all current and emerging players in this new government IT market to consider how this will impact the bandwidth performance for government agencies. Many agencies are already dealing with bandwidth issues, which is why WAN optimization solutions will be more vital than ever.
The prospect of government leaders having vast amounts of data at their fingertips for real-time, effective decision-making is very exciting. Federal agencies will be more nimble and responsive, especially in disaster situations where lives are at stake — and you can’t put a price on that.
We just need to be able to manage big data in a way that does not negatively impact network performance.