Why would the performance improvement be visible only on specific hardware?
Ellison said users can expect real-time analytics queries 100 times faster and online transaction processing that is two times faster as long as they are using hardware that supports the Oracle 12c database.
I’ll assume this can only mean that the results will be seen when the data fits in memory, and not that custom hardware is needed to enable the feature. As a side note, I’m not sure I’m reading the announcement correctly, but it looks like paying Oracle database customers will have to pay extra for the in-memory option.
Can anyone explain how data can be stored in both columnar and row format?
Additionally, the software will allow people to store data in both columns (used for analytics) and rows (used for transactions) as opposed to only one method; Ellison described this function as being “the magic of Oracle.”
Magic has very little to do with databases and performance.
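Indeed, there is nothing magical about keeping the same records in two layouts; it is a well-understood trade-off. A minimal Python sketch of the two formats, purely illustrative and unrelated to Oracle's actual implementation:

```python
# The same three records, kept in both layouts.

# Row format: each record stored contiguously -- suits OLTP,
# where a transaction reads or writes whole rows.
rows = [
    {"id": 1, "region": "EU", "amount": 120},
    {"id": 2, "region": "US", "amount": 75},
    {"id": 3, "region": "EU", "amount": 200},
]

# Columnar format: each attribute stored contiguously -- suits
# analytics, which scan one column across many rows.
columns = {
    "id":     [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [120, 75, 200],
}

# OLTP-style access: fetch one complete record.
record = rows[1]                # {'id': 2, 'region': 'US', 'amount': 75}

# Analytics-style access: aggregate one column, ignore the rest.
total = sum(columns["amount"])  # 395
```

The hard part, which the quote glosses over, is keeping the two representations transactionally consistent under concurrent updates.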
Original title and link: Two questions about the Oracle in-memory database ( ©myNoSQL)
With MySQL version 5.6 (and above), you have the ability to store and retrieve NoSQL data, using NoSQL commands, while keeping the data inside a MySQL InnoDB database. So, you can use NoSQL and SQL at the same time, on the same data, stored in the same database. And the beauty is that it takes just a few minutes to set up. This post will provide you with a quick lesson on how to set up NoSQL on a MySQL InnoDB database.
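For context, the mechanism the quoted post refers to is MySQL's InnoDB memcached plugin: after installing the `daemon_memcached` plugin and loading the `innodb_memcached_config.sql` script that ships with MySQL, memcached `get`/`set` commands are mapped onto rows of a regular InnoDB table, so the same data remains queryable with SQL. A toy Python sketch of that mapping, where a plain dict stands in for the backing InnoDB table; this is not the plugin's code:

```python
# Conceptual sketch: one backing store, two access paths.
# A dict stands in for the InnoDB table the plugin would write to.
table = {}

def memcached_set(key, value):
    """NoSQL-style write: lands in the backing table."""
    table[key] = value

def memcached_get(key):
    """NoSQL-style read: returns the same row SQL would see."""
    return table.get(key)

def sql_select_all():
    """SQL-style scan over the same backing table."""
    return sorted(table.items())

memcached_set("user:42", "alice")
value = memcached_get("user:42")   # "alice" -- same row a SELECT would return
everything = sql_select_all()      # [("user:42", "alice")]
```

The point the quote makes is exactly this: the key-value interface and SQL are two doors into the same table, not two copies of the data.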
I see this trivialization of the term NoSQL quite frequently in the communications signed by Oracle: “Oh, you want NoSQL? Take memcached. Now shut up!” This is quite disrespectful to their customers and the developer community in general.
Original title and link: You want NoSQL? I’ll give you memcached ( ©myNoSQL)
The Oracle version of HealthCare.gov. Let’s see:
“Oracle was contracted to deliver the exchange,” Merkley said, “they promised it would be fairly delivered on time and it’s in complete dysfunction.”
Oregon has spent more than $40 million to build its own online health care exchange. It gave that money to a Silicon Valley titan, Oracle, but the result has been a disaster of missed deadlines, a nonworking website and a state forced to process thousands of insurance applications on paper.
Some Oregon officials were sounding alarms about the tech company’s work on the state’s online health care exchange as early as last spring. Oracle was behind schedule and, worse, didn’t seem able to offer an estimate of what it would take to get the state’s online exchange up and running.
The biggest reason Cover Oregon’s website lags behind is because Oracle didn’t meet its deadline and should have begun testing last May, rather than delaying until this summer when it was too late to resolve the problems it encountered, King said. Oracle has been paid handsomely by Cover Oregon for its consulting and software development. It’s received $43.2 million this year – accounting for $11.1 million for hardware, $9.5 million for software and $22.6 million for consulting.
So even if you use Oracle for everything (hardware, software, and consulting), paid a paltry $43.2 million in 2013, you can still fail? What a surprise!
✚ Who’d take the blame if HealthCare.gov and Cover Oregon simply switched contractors?
✚ Could we have these played on repeat for those blaming MarkLogic for HealthCare.gov’s failure? And for those who accepted that excuse?
Original title and link: Blame it on… Oracle style ( ©myNoSQL)
This HadoopSphere post lists 8 data-related products in Oracle’s portfolio. I’m not sure it’s complete though, as I didn’t see TimesTen, Coherence, etc.
It’s nice to be able to tell your customers, potential and existing, that you have tools for everything. The tricky part is integrating them, making them work seamlessly together, and offering a clear picture to every user. Or, if you are Oracle, you could charge customers for this part too.
Original title and link: Oracle’s Big Data Components ( ©myNoSQL)
Based on ESG’s modeling of a medium-sized Hadoop-oriented big data project, the preconfigured Oracle Big Data Appliance is 39% less costly than a “build” equivalent do-it-yourself infrastructure. And using Oracle Big Data Appliance will cut the project length by about one-third. For most enterprises planning to take big data beyond experimentation and proof-of-concept, ESG suggests skipping the idea of in-house development, on-going management, and expansion of your own big data infrastructure, to instead look to purpose-built infrastructure solutions such as Oracle Big Data Appliance.
This is an extract from Oracle’s whitepaper “Getting Real about Big Data: Build Versus Buy”. It’s a nice reading exercise to better understand how the database leader is positioning its Oracle Big Data Appliance against Hadoop’s commodity-hardware clusters.
I’d love to see the equivalent paper from Hortonworks1.
The only reason I’m referring directly to Hortonworks and not also Cloudera is that the Hadoop part of Oracle Big Data Appliance is offered by Cloudera. ↩
Original title and link: Oracle Paper: The Cost of Do-It-Yourself Hadoop vs Oracle Big Data Appliance ( ©myNoSQL)