Tuesday, December 11, 2012

Technics: Big Data: An expert, Adrian Lane, searches for a definition

I'm not sure I understood more than two-thirds of this definitional essay, but I'm (still) trying. How about you?

-- Technowlb, refWrite Backpage technology newspotter, analyst, columnist




Dark Reading (Dec11,2k12)

What Is Big Data?

Big data is not about buying more big iron
Dec 07, 2012 | 01:44 PM |  1 Comment

by Adrian Lane
Dark Reading

When someone says big data, what do you think of?
Do you think of mainframes? Data warehouses? Do you think of Oracle Grids, Exadata, or Teradata clusters?
Perhaps you think of Hadoop, MongoDB, Cassandra, or CouchDB? Or maybe it's any NoSQL database?
Or perhaps you think it's just a giant mass of data in one place?
If you read press articles on big data, then it's all of these things. It's my belief that no good definition of big data exists today. In fact, the term is so overused, and I think intentionally so, that it's almost meaningless. I want to address that problem here.
And I'll state up front: the big data phenomenon did not arise because people are buying more big iron.
During the past year, I've spent an inordinate amount of time researching security in and around big data clusters. It has been a challenge; each time I think I have a handle on one aspect of what constitutes big data, I find an exception that breaks the conceptual model I've created. Every time I think I've quantified a specific attribute or feature, I find another variation of NoSQL that's an exception to the rule. It was even a struggle just to define what big data actually is, with definitions from Wikipedia and other sources missing several essential ingredients. In fact, the definition section of the Wikipedia entry on big data does not really offer a definition at all. All in all, this is one of the most difficult, and most interesting, research projects I've been involved with.
I want to share some of the results of that research here because I think it will be helpful in understanding why securing big data is difficult, and how the challenge differs from that of the relational platforms many of you are familiar with. In a future post, I'll discuss some of the fundamental differences in how big data systems are deployed and managed from a security perspective, but before I can talk about how to secure "it," I need to define what "it" is.
Yes, big data is about lots of data, of differing types, coming in at velocities that cripple most traditional database systems. But there are other essential characteristics besides size and the need for fast insertion, such as the ability to elastically scale as the data set grows. It's about distributed, parallel processing to tackle massive analysis tasks. It's about data redundancy to provide failure resistant operation, which is critical when computing environments span so many systems that hardware failures are to be expected during the course of operation.
And just as important, these systems are hardware-agnostic, accessible from a complexity standpoint, extensible, and relatively inexpensive. Together, these characteristics define big data systems.
The poster child for big data is Hadoop, a framework that at its core provides data management and query (map-reduce) services across (potentially) thousands of servers. Everything about big data clusters is designed to address storage and processing of multiple terabytes of data across as many systems as needed, in an elastic, expansive way. In fact, these clusters are so large that the probability of failure rises to the point where it's likely some node will fail during operation. This elasticity, resiliency, and capacity to process requests in more than one location are what make big data different from the databases that came before it.
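The map-reduce model that Hadoop distributes across thousands of servers can be sketched in a few lines. The following is a hypothetical single-process illustration in Python, not Hadoop code; the function names are illustrative. It shows the same three phases (map, shuffle, reduce) using the classic word-count task:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data is big", "data at velocity"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

In a real cluster, the map and reduce functions run in parallel on the nodes that hold the data, and the shuffle moves intermediate pairs between them; the programming model, however, is just this simple.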
But the reason big data is a major trend is the convergence of three things: huge amounts of data, cheap computing resources, and free (or nearly free) analytic tools. Enterprises and midmarket firms are embracing big data not because they can suddenly afford to invest millions of dollars in data warehouse systems, MPPs, mainframes, or giant systems-in-a-box, but because they can now afford data analysis on massive data sets without spending much money up front. Cheap commodity or cloud computing resources, combined with free and easy data management systems like Hadoop, make it possible.
If you need to understand what big data is, then consider the characteristics outlined above. They should help you differentiate traditional systems from big data.
Adrian Lane is an analyst/CTO with Securosis LLC, an independent security consulting practice. Special to Dark Reading.
