An interesting post on the Teradata Aster blog indirectly emphasizes the weaknesses of the Hadoop platform:
- Make the platform and tools easier to use for managing and curating data. Otherwise, garbage in = garbage out, and you will get garbage analytics.
- Provide rich analytics functions out of the box. Each line of programming cuts your reachable audience by 50%.
- Provide tools to update or delete data. Otherwise, data consistency will drift away from truth as history accumulates.
- Provide applications to leverage data and find answers relevant to the business. Otherwise, the cost of DIY applications is too high for them to influence the business, and they won't be built.
It's difficult to argue against these points, but they are not insurmountable. I'd even say that once the operational complexity of Hadoop deployments comes down (the Apache community, Cloudera, and Hortonworks are already working on this), Hadoop will see even more adoption, and with that, contributions addressing points 2 to 4 will follow shortly.
Another interesting part of the post is the pair of "equations" describing the two environments:
big clusters = big administration = big programs = big friction = low influence (Hadoop)
big data = small clusters = easy administration = big analytics = big influence (ideal/Teradata Aster)
I think these reveal how Teradata Aster is positioning its solutions and where it sees itself making money in the Big Data market. The pitch goes like this: "we can make a lot of money if we offer a platform with lower complexity and operational costs and higher productivity, leading to better business results". This is a sound strategy, and competitors from the Hadoop space would do well to focus on these same aspects, which are essential to wide adoption.
Original title and link: Hadoop Weaknesses and Where Teradata Aster Sees the Big Data Money ( ©myNoSQL)