FCW : October 15, 2012
It is estimated that U.S. government agencies will add a full exabyte of data to their data stores during the next two years, the equivalent of more than 62 million 16 GB iPads. When the rate of data growth is combined with the velocity, or bandwidth, required to move all that data, much of it unstructured data created by video, audio, social media and other means, the issue becomes clear. The problem is one of Big Data: data sets whose size and complexity are beyond the ability of standard tools to capture, store, manage and analyze within a tolerable elapsed time.

"Organizations are at an inflection point with respect to their data. It will become difficult to do business as usual," says Dale Wickizer, Chief Technology Officer for the U.S. Public Sector at NetApp, a leader in storage and data management. "If something doesn't change, the data will bury you and become a huge cost and risk burden to the infrastructure and the mission. But if you figure out how to harness it, it can become an asset."

A recent MeriTalk study bears this out. Overwhelmingly, agency executives want better ways to unlock and harness agency data to improve efficiency, speed decision-making, and improve forecasting ability. Agencies estimate that today they have just 49% of the data storage and access, 46% of the computational power and 44% of the personnel they need to leverage Big Data and drive mission results.

NetApp has a suite of solutions designed to address all aspects of Big Data, including high-performance computing (HPC), based upon NetApp's modular E-Series storage product. E-Series was designed with fast performance, physical density and scalability in mind. It can handle large, complex datasets that involve multiple storage systems without compromising data protection and integrity. Government agencies are taking note. The Energy Department's Sequoia Project at Lawrence Livermore National Lab is the fastest Blue Gene supercomputer in the world.
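The exabyte figure above is easy to sanity-check with back-of-the-envelope arithmetic (a quick illustrative calculation, not from the article; it assumes the decimal units storage vendors typically use):

```python
# Sanity check: how many 16 GB iPads hold one exabyte?
EXABYTE = 10**18          # 1 EB in bytes (decimal convention)
IPAD_16GB = 16 * 10**9    # 16 GB in bytes

ipads = EXABYTE / IPAD_16GB
print(f"{ipads / 1e6:.1f} million 16 GB iPads")  # 62.5 million
```

This matches the article's "over 62 million" figure.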
It is capable of achieving 20 petaFLOPS of peak performance, using 1.6 million processor cores and 55 petabytes of NetApp E-Series storage. That storage system will provide more than 1 terabyte per second of write performance to the disk subsystem. Large HPC environments like Sequoia can run tens of thousands of high-capacity disk drives. When high-capacity disks fail, rebuilds can take a very long time and degrade system performance. To address this concern, the latest version of the E-Series Operating System (EOS) supports the use of Dynamic Disk Pools, an innovation that essentially virtualizes RAID protection and dramatically speeds the recovery of high-capacity drives, from days down to a few hours.

Another NetApp technology highly useful for Big Data environments is the more familiar FAS series of enterprise storage systems. It combines intelligent caching, integrated data protection, storage efficiency features, non-disruptive operations and massive scale to create a multiprotocol, agile data infrastructure that can support the most demanding cloud and virtualized environments.

For organizations with large, geographically distributed data repositories, NetApp's StorageGRID software and E-Series storage are often the best route. StorageGRID is an object-based technology that can support billions of objects in a single object space, across numerous geographically dispersed sites. The latest version of StorageGRID supports the SNIA Cloud Data Management Interface (CDMI) for accessing data using open standards. Objects are distributed in accordance with a powerful policy engine, allowing those objects to be placed on different tiers of disk or tape, and at one or more locations, to balance cost and protection. Strong error checking and correction at the software layer and in the storage guard against "bit rot."
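The days-to-hours rebuild claim can be illustrated with a toy model. The drive size, sustained rebuild rate and pool size below are assumptions chosen for illustration, not NetApp specifications, and the sketch ignores the throttling that real arrays apply to protect foreground I/O; it only shows why spreading reconstruction across a pool shortens recovery roughly in proportion to the number of participating drives:

```python
# Toy model: single-spare RAID rebuild vs. a distributed rebuild.
# All figures below are illustrative assumptions, not NetApp numbers.
DRIVE_TB = 3          # capacity of the failed drive
REBUILD_MBPS = 100    # sustained rebuild write rate per drive
POOL_DRIVES = 60      # drives in the pool that share the rebuild

drive_bytes = DRIVE_TB * 10**12
per_drive_rate = REBUILD_MBPS * 10**6  # bytes per second

# Traditional RAID: the whole drive is reconstructed onto one hot
# spare, so the spare's write rate is the bottleneck.
classic_hours = drive_bytes / per_drive_rate / 3600

# Dynamic-Disk-Pool style: reconstruction is spread across the pool,
# with each drive absorbing a small slice in parallel.
pooled_hours = classic_hours / POOL_DRIVES

print(f"single-spare rebuild: {classic_hours:.1f} hours")
print(f"distributed rebuild:  {pooled_hours:.2f} hours")
```

Under these assumed numbers the single-spare rebuild takes about 8 hours while the distributed rebuild takes minutes; production rebuilds are much slower than this raw throughput bound, which is how "days" become "a few hours."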
For even stronger protection, objects can be kept in multiple copies at geographically separate sites.