Perspectives on Hadoop Part Two: Pausing Plans

By Merv Adrian and Nick Heudecker

In the first post in this series, I looked at the size of revenue streams for RDBMS software and maintenance/support and noted that they amount to $33B, pointing out that pure-play Hadoop vendors had a high hill to climb. (I didn’t say so specifically, but in 2014, Gartner estimates that the three leading vendors generated less than $150M.)

In this post, Nick and I turn from Procurement to Plans and examine the buying intentions uncovered in Gartner surveys.


– more on the Gartner blog –

Perspectives on Hadoop: Procurement, Plans, and Positioning

I have the privilege of working for the world’s leading information technology research and advisory company, covering information management with a strong focus for the past few years on an emerging software stack called Hadoop. In the early part of 2015, that technology is moving from early adopter to early majority status in its marketplace adoption. The discussions and published work around it have been exciting and controversial, so in this post (and a couple to follow) I describe three interlocking research perspectives on Hadoop: procurement (counting real money actually spent), plans (surveys of intentions to invest), and positioning (subjective interpretations of what the first two mean).

Procurement Perspective: Hadoop is a (Very) Small Market Today

– more on the Gartner blog –



Aspirational Marketing and Enterprise Data Hubs

In the Hadoop community there is a great deal of talk of late about its positioning as an Enterprise Data Hub. My term for this is “aspirational marketing”: it addresses the ambition its advocates have for how Hadoop will be used once capabilities now in early development realize that vision. There’s nothing wrong with this, but it does need to be kept in perspective. It’s a long way off.


Hadoop and DI – A Platform Is Not A Solution

“Hadoop people” and “RDBMS people” – including some DBAs who have contacted me recently – clearly have different ideas about what Data Integration is. And both may differ from what Ted Friedman and I were talking about in our Gartner research note Hadoop Is Not a Data Integration Solution, although I think the DBAs’ concept is far closer to ours.


Stack Up Hadoop to Find Its Place in Your Architecture

2013 promises to be a banner year for Apache Hadoop, platform providers, related technologies – and analysts who try to sort it out. I’ve been wrestling with ways to make sense of it for Gartner clients bewildered by a new set of choices, and for them and myself, I’ve built a stack diagram that describes the functional layers of a Hadoop-based model.


2013 Data Resolution: Avoid Architectural Cul-de-Sacs

I had an inquiry today from a client using packaged software for a business system built on a proprietary, non-relational datastore (in this case, an object-oriented DBMS). They have an older version of the product, having “failed” with a recent upgrade attempt.

The client contacted me to ask about ways to integrate this OODBMS-based system with others in their environment. They said the vendor-provided utilities were poor and hard to use, and the vendor has given them no confidence that they will improve. The few staff programmers who have learned enough of the internals have already built a number of one-off connections using multiple methods, and were looking for a more generalizable way to create a layer other systems could use when they need data from the underlying database. They expect more such requests, and foresee chaos, challenges hiring and retaining people with the right skills, and cycles of increasing cost and operational complexity.

My reply: “You’re absolutely right.”

Cloudera-Informatica Deal Opens Broader Horizons for Both

Cloudera’s continuing focus on the implications of explosive data growth has led it to another key partnership, this time with Informatica. Connecting to the dominant player in data integration and data quality expands the opportunity for Cloudera dramatically; it enables the de facto commercial Hadoop leader to find new ways to empower the “silent majority” of data. The majority of data lives outside: not just outside enterprise data warehouses, but outside RDBMS instances entirely. Why? Because it doesn’t need all the management features database management software provides – it doesn’t get updated regularly, for example. In fact, it may not be used very often at all, though it does need to be persisted for a variety of reasons.

I recently mentioned Cloudera’s success of late; it’s going to be challenged by some big players in 2011, notably IBM, whose recent focus on Hadoop has been remarkably nimble. So these deals matter. A lot. The Data Management function is being refactored before our eyes; both these vendors will play in its future.

Attunity Scores a Win With RMS CDC Support

Today’s email brought a reminder of an old, valued data format: RMS. When I posted about Attunity earlier this year, I noted the value of its replication and change data capture (CDC) technology as the major software infrastructure vendors continue to look at ways to consolidate the management of their customers’ data assets. Attunity is in the rare position of having its software OEMed by many of them somewhere in their portfolios; IBM, Oracle, and Microsoft [edit – removed Sybase, listed due to my error] all use and sometimes resell Attunity’s technology. RMS is a more recent addition to Attunity’s CDC portfolio, and its win at Southeastern Freight Lines bodes well for a new addition to its revenue stream.

Decoding BI Market Share Numbers – Play Sudoku With Analysts

In a recent post I discussed Oracle’s market share in BI, based on a press-published chart taken from IDC data – showing Oracle coming in second. As often happens in such discussions, I got quite a few direct emails and Twitter messages – some in no uncertain terms – about why the particular metric I chose was not sufficiently nuanced or representative of the true picture. I freely admit: that’s true. In general, market observers know Oracle is not typically placed second overall – but the picture is more complex than a single ranking. My point was, and is, that it’s too easy to slip into a “who’s on top” mentality that obscures true market dynamics. In this post, I’ll dig a bit deeper and describe what different approaches and categorizations show us – and what they don’t. Finally, I’ll talk about how much this matters – and to whom.

Oracle’s High BI Bar: Managed, Multifaceted and Actionable

Oracle’s newest BI release is massive, spans multiple product categories, and raises the bar for competitors in dramatic fashion. In my prior post I focused on its rollout and competitive posture. The market has waited a long time as the reconciliation of many moving parts was accomplished – most notably the convergence of the Hyperion Enterprise Performance Management (EPM) offering and Oracle Business Intelligence Enterprise Edition (OBIEE). Hyperion’s integration with its Essbase acquisition was not complete. In 2007, OBI’s newest release (10.1.3) was most notable in many eyes for its new Microsoft Office support. PeopleSoft and Siebel had been acquired some two years before that, and Master Data Management was already a topic of discussion then (2005). There was a long way to go. And analysts? Well, think of us as the kids in the back seat: “Are we there yet?”

Oracle has used its time, and its $3B-per-year investment in R&D, well. OBIEE 11g delivers a strong base for its customers to build upon, and for its own teams to continue fleshing out a very coherent vision of ready-to-consume, actionable analytics suitable for multiple roles, on multiple platforms, across the breadth of information available. Although there is much left to do, Oracle has laid out a clear path and articulated a differentiated message that offers ample reasons for anyone on other platforms to consider OBIEE, whether or not they are an Oracle customer. For this analyst, the big wins are the Common Enterprise Information Model, the Action Framework, the strong manageability focus, unified and enhanced user interaction for designing and delivering reports and other content, and the BI applications.




