Even when we do start to integrate and correlate event, configuration, vulnerability, or logging data, the result is very IT-centric, very infrastructure-centric. It says little about the actual information in use or in transit, or about how that information is being consumed and related.
The major issue with data "lakes" is that turning data into intelligence and knowledge requires a good understanding of the data itself. How else would one reconcile artifact 'A' with variable 'B' and context 'C' generated from three separate data sources?
That's very true. But a good understanding of data without the data itself, or without the tools to process it, is worth nothing.
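The reconciliation problem described above can be sketched in a few lines. This is a hypothetical illustration, not anyone's actual pipeline: the record shapes, field names, and the shared `host` key are all assumptions standing in for the "understanding of the data" that correlation depends on.

```python
# Hypothetical sketch: correlating records from three separate data sources.
# The shared "host" key and agreed field meanings stand in for the
# understanding of the data that turns raw records into intelligence.

events = [{"host": "web01", "event": "login_failure"}]    # source 1: event log
configs = [{"host": "web01", "ssh_root_login": True}]     # source 2: configuration
vulns = [{"host": "web01", "cve": "CVE-2023-0001"}]       # source 3: vulnerability scan

def correlate(host):
    """Join the three sources on the shared key, producing one
    correlated view of a single host."""
    return {
        "host": host,
        "events": [e for e in events if e["host"] == host],
        "config": next((c for c in configs if c["host"] == host), None),
        "vulns": [v for v in vulns if v["host"] == host],
    }

print(correlate("web01"))
```

Without that shared key and agreed semantics, the three sources remain three disconnected piles of records, which is exactly the data-lake failure mode the post is pointing at.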
Original title and link: BigData, Hadoop, and the Impending Informationpocalypse (©myNoSQL)