analytics: All content tagged as analytics in NoSQL databases and polyglot persistence

Big Data: Achieve the Impossible in Real-Time

Jean-Pierre Dijcks (Oracle):

The main components in the big data platform provide:

  • Deep Analytics — a fully parallel, extensive and extensible toolbox full of advanced and novel statistical and data mining capabilities
  • High Agility — the ability to create temporary analytics environments in an end-user driven, yet secure and scalable environment to deliver new and novel insights to the operational business
  • Massive Scalability — the ability to scale analytics and sandboxes to previously unknown scales while leveraging previously untapped data potential
  • Low Latency — the ability to instantly act based on these advanced analytics in your operational, production environments

Big Data Platform

If I were to be picky, the only thing I’d change would be the order: 1) low latency; 2) massive scalability; 3) high agility; 4) deep analytics.

Original title and link: Big Data: Achieve the Impossible in Real-Time (NoSQL databases © myNoSQL)

via: http://blogs.oracle.com/datawarehousing/entry/big_data_achieve_the_impossible


Advanced Analytics and Big Data: Why put them together?

Philip Russom:

Here are a few reasons:

  • Big data yields gigantic statistical samples
  • Analytic tools and databases can now handle big data
  • There’s a lot to learn from messy data, as long as it’s big
  • Big data is a special asset that merits leverage

The last two points simply mean “The Unreasonable Effectiveness of Data”.

Original title and link: Advanced Analytics and Big Data: Why put them together? (NoSQL databases © myNoSQL)

via: http://tdwi.org/blogs/philip-russom/2011/04/the-intersection-of-big-data-and-advanced-analytics.aspx


Origin of BigData and How Hadoop Can Help

Michael Olson[1] about the origins of BigData, in an interview on ODBMS Industry Watch:

It used to be that data was generated at human scale. You’d buy or sell something and a transaction record would happen. You’d hire or fire someone and you’d hit the “employee” table in your database.

These days, data comes from machines talking to machines. The servers, switches, routers and disks on your LAN are all furiously conversing. The content of their messages is interesting, and also the patterns and timing of the messages that they send to one another. (In fact, if you can capture all that data and do some pattern detection and machine learning, you have a pretty good tool for finding bad guys breaking into your network.) Same is true for programmed trading on Wall Street, mobile telephony and many other pieces of technology infrastructure we rely on.

and how Hadoop can help:

Hadoop knows how to capture and store that data cheaply and reliably, even if you get to petabytes. More importantly, Hadoop knows how to process that data — it can run different algorithms and analytic tools, spread across its massively parallel infrastructure, to answer hard questions on enormous amounts of information very quickly.
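
To make the “process that data” part concrete, here is a minimal MapReduce sketch (not code from the interview, just an illustration of the model Olson describes): it counts machine-to-machine messages per source host from plain-text logs stored in HDFS. The log layout, with the source host as the first whitespace-separated field, and all class and path names are assumptions made for this example.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class MessageCount {

        // Mapper: emit (sourceHost, 1) for every log line.
        public static class HostMapper
                extends Mapper<LongWritable, Text, Text, LongWritable> {
            private static final LongWritable ONE = new LongWritable(1);
            private final Text host = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\\s+");
                if (fields.length > 0 && !fields[0].isEmpty()) {
                    host.set(fields[0]); // assumption: first field is the source host
                    context.write(host, ONE);
                }
            }
        }

        // Reducer: sum the counts per host; instances run in parallel across the cluster.
        public static class SumReducer
                extends Reducer<Text, LongWritable, Text, LongWritable> {
            @Override
            protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                    throws IOException, InterruptedException {
                long sum = 0;
                for (LongWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new LongWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "messages per host");
            job.setJarByClass(MessageCount.class);
            job.setCombinerClass(SumReducer.class); // pre-aggregate locally to cut shuffle traffic
            job.setMapperClass(HostMapper.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Packaged as a jar, it would be launched with something like hadoop jar messagecount.jar MessageCount /logs/raw /logs/counts; Hadoop schedules the map tasks next to the HDFS blocks holding the logs, which is the “massively parallel infrastructure” Olson refers to.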


  1. Michael Olson: CEO of Cloudera, former CEO of Sleepycat Software, the makers of Berkeley DB acquired by Oracle; @mikeolson  

Original title and link: Origin of BigData and How Hadoop Can Help (NoSQL databases © myNoSQL)


Types of Big Data Work

Mike Minelli: Working with big data can be classified into three basic categories […] One is information management, a second is business intelligence, and the third is advanced analytics.

Information management captures and stores the information, BI analyzes data to see what has happened in the past, and advanced analytics is predictive, looking at what the data indicates for the future.

There’s also a list of tools for BigData: AsterData (acquired by Teradata), Datameer, Paraccel, IBM Netezza, Oracle Exadata, EMC Greenplum.

Original title and link: Types of Big Data Work (NoSQL databases © myNoSQL)

via: http://www.linuxinsider.com/story/71945.html


11 Big-Data Analytics Predictions for 2011

Maybe I’m over-simplifying it, but I’m reading Ketan Karia’s[1] “11 BigData analytics predictions for 2011” as:

  • hardware will be pushing BigData analytics forward; software will just catch up and follow the hardware’s lead
  • 2011 will bring more BigData analytics adoption, which also means analytics companies will be more profitable

Here are the original 11 predictions:

  1. We’ll hark the chips, not the hardware

    Many companies keep throwing hardware (especially more servers) at the problem, and the chip industry’s enormous investment in computer performance continues to sit idly by.

  2. Chip scale-out will date MPP and shrink big data networks.

    […] we are going to have the capability to run 256 or 512 cores on a single chip. If companies can do that, it will shrink the amount of hardware required to fuel the big data networks.

  3. Memory will go RAM.

    As the core density of chips and the RAM size keeps rising dramatically, total in-memory data warehouses are now feasible.

  4. Chip companies will spend more on R&D in 2011

    It will become more commonplace for engineers to stay in tune with chip technology advancements as they build out their solutions to manage and leverage huge volumes of business data.

  5. Acceleration of analytics will support the agile enterprise

    Analytics technologies will help businesses be more agile and will become a key business differentiator in 2011 and beyond.

  6. Businesses will sponsor their own analytical capabilities

  7. Analytics gets more embedded into business applications

  8. Open source moves to more hybrid models.

    Hybrid models are being used by companies including JasperSoft, SugarCRM, and Ingres (with our VectorWise database).

  9. Subscriptions stack up by the hour.

  10. Self-service BI gets more attention.

    In 2011, companies will have to accelerate report delivery.

  11. Users want to be “in the moment” with data insights

What is the definition of “prediction”? A guess, a bet, or an educated thought?


  1. Ketan Karia: Chief Marketing Officer and Senior Vice President at Ingres  

Original title and link: 11 Big-Data Analytics Predictions for 2011 (NoSQL databases © myNoSQL)

via: http://tdwi.org/articles/2011/03/16/Big-Data-Analytics-Predictions.aspx