FCW: November 15, 2012
BIG DATA

Defense Department, the Energy Department and the U.S. Geological Survey. Six months later, the ball has started to roll: On Oct. 3, NSF and NIH announced their first funding awards through the big data initiative. The eight awards, totaling $15 million, were granted to widely diverse projects. Among them are a collaborative effort by Carnegie Mellon University and the University of Minnesota to simulate language processing in the brain and a Brown University project that seeks to design and test algorithmic and statistical techniques for analyzing large-scale, heterogeneous and so-called noisy cancer datasets.

"We've barely scratched the surface," Suzi Iacono, a senior science adviser at NSF and co-chairwoman of the Big Data Senior Steering Group, told FCW. She added that more awards will be presented in the months to come.

The steering group is the interagency team responsible for executing the administration's big data plans across 20-some agencies. It was chartered in 2011 and includes representatives from the Defense Advanced Research Projects Agency, the Office of the Secretary of Defense, DOE, the Department of Health and Human Services, and NASA.

"Everyone wants to be part of this," Iacono said. "I think that's because we'll be able to accelerate the pace of discovery if we can mine the datasets we have. Truly, we'll be able to transform commerce and the economy and address the most pressing issues facing society."

But of all the technology buzzwords that have crossed from Silicon Valley to politics and governance in recent years, such as "open government" and "cloud computing," "big data" is arguably the most vague. That is not necessarily the government's fault. After all, even the tech sector has trouble defining big data. However, most agree that the concept refers to four attributes when dealing with digital datasets: volume, velocity, variety and veracity.

Myriad government agencies "have been collecting data for over a hundred years now, and we finally have technology and the wherewithal to use it," said Dan Olds, founder of IT advisory firm Gabriel Consulting Group and chief editor of the blog "inside-BigData." Those reams of data are now so voluminous, they're nearly unmanageable. As the TechAmerica Foundation said in a report released Oct. 5, "Since 2000, the amount of information the federal government captures has increased exponentially. In 2009, the U.S. government produced 848 petabytes of data, and U.S. health care data alone reached 150 exabytes. Five exabytes of data would contain all words ever spoken by human beings on earth."

The problems extend beyond what to do with that data and how to manage it. Federal agencies hope to harness the data for new insights and use it for the benefit of government and the public. New ways to organize and analyze data, in fact, could be a matter of life or death.

"Imagine some kind of weather emergency: if we could have data from all the models of that emergency, then integrate that with real-time weather data and census data and give this to responders on the ground," Iacono said. "Being able to make these kinds of split-second decisions is...one of the holy grails of big data."

Big data has already saved lives in smaller settings. In a pilot program launched in 2008 at several hospitals, big

The four attributes of big data:

Volume: The sheer amount of data generated, or data intensity, that must be ingested, analyzed and managed to make decisions based on complete data analysis.

Velocity: How fast data is being produced and changed, and the speed with which data must be received, understood and processed.

Variety: Both structured and unstructured data generated by a wide range of sources.

Veracity: The quality and provenance of received data.

Source: TechAmerica Foundation