Handling ‘Big Data’

October 18, 2011 — Big data doesn’t necessarily mean big headaches.

Let’s outline the problem: combine data from mobile devices, RFID tags, aerial sensors, software logs, and social media, and the sheer volume can overwhelm a typical analyst.

Furthermore, information can reside in secure silos and proprietary data stores. The challenge for federal IT professionals is to derive deep insights from this proliferation of information.

GCE Federal has earned its stripes helping federal agencies in the financial arena.


As the volume of data agencies generate has exploded, GCE Federal has developed expertise in handling what is now called “Big Data.”


GCE Federal President Ray Muslimani gives a good technical overview of a technology called Hadoop.

Hadoop originated in 2006 as an open-source project under the Apache Software Foundation.

It gives organizations a way to store and process terabytes of information across clusters of commodity servers. James Kobielus of Forrester Research writes that “Hadoop will be the nucleus of next-generation data warehouses.”
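
To make that more concrete, here is a minimal sketch of the canonical MapReduce word-count job written against the Hadoop 1.x Java API, the kind of program Hadoop runs in parallel across a cluster. The class name and the input and output paths are illustrative assumptions, not code from GCE Federal or any agency system.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: each mapper reads a slice of the input and emits (word, 1) pairs.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: Hadoop groups the pairs by word and each reducer sums the counts.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    // Driver: configures the job and points it at input and output paths in HDFS.
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "word count");       // Hadoop 1.x-era constructor
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);   // local pre-aggregation to cut shuffle traffic
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Packaged into a jar, a job like this would typically be launched with a command along the lines of hadoop jar wordcount.jar WordCount /data/input /data/output, with Hadoop handling the distribution of map and reduce tasks across the cluster.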