Government agencies are collecting vast amounts of data, but they're struggling just to store it, let alone analyze it to improve efficiency, accuracy and forecasts. On average, government agencies store 1.61 petabytes of data, but expect to be storing 2.63 petabytes within the next two years. These data include reports from other government agencies at various levels, reports generated by field staff, transactional business data, scientific research, imagery/video, Web interaction data and reports filed by non-government agencies. While government agencies collect massive amounts of data, MeriTalk's report found that only 60 percent of IT professionals say their agency analyzes the data it collects, and fewer than 40 percent say their agency uses the data to make strategic decisions. That includes the U.S. Department of Defense and intelligence agencies, which on average are even further behind civilian agencies when it comes to Big Data. While 60 percent of civilian agencies are exploring how Big Data could be brought to bear on their work, only 42 percent of DoD/intel agencies are doing the same. Click on the image or the title to learn more.
A good article on the risks of interpreting large-scale data mining, using Google Flu Trends as an example. The conclusion is that data-driven indications are often only the starting point of a process that still requires scientific grunt work to determine whether a given correlation is actually relevant, and the reality is that sometimes that grunt work won't be done because it's hard, requires funding and takes time. Worth reading.