Last week I attended Oracle OpenWorld 2013 in the stunning city of San Francisco, along with 60,000 other attendees. At times it felt like we’d taken over the entire city, with every street, bus, billboard and hotel plastered in Oracle logos and pictures of engineered systems… although apparently there was some other stuff going on too.
I learnt a lot from OOW this year. I met many customers and potential customers, attended sessions from Oracle and its partners (including Violin’s competitors) and spent some time with friends at OaktableWorld as an antidote to the marketing hype. Oracle is many things to many people, but one thing that’s hard to deny is the company’s drive for innovation. Every year there are new products, new features, new options to learn – it’s very impressive. Of course, each one of these invariably means paying more license money – but those yachts don’t come cheap. This year as we looked to the future there were discussions about In Memory, Big Data, the Internet of Things and M2M. But what about the present? And more importantly, what about those of us still tied to the past?
The Database In-Memory Option
In his opening keynote this year, Larry Ellison announced the Oracle Database In-Memory Option, seen by many as an attempt to counter SAP’s HANA in-memory database and Microsoft’s In-Memory OLTP feature for SQL Server 2014. This was by no means the only announcement of the week, or even of the night (take, for example, the Oracle Big Memory Machine which, with 384 cores, I can’t help feeling would have been better named the Big License Bill Machine), but it’s a great example of the problem I want to discuss.
The obvious criticism is Oracle’s tiresome policy of pre-announcing and re-announcing the same thing (perfectly described by Doug Henschen here). The In-Memory Option isn’t available yet, nor will it be until “sometime next year”, which could conceivably be after OpenWorld 2014. But my real issue is that, like many other announcements, it’s a feature of the 12c database… which means almost everyone running in production won’t be able to use it.
OK, so maybe by the time Oracle finally rolls it out there will be some early adopters running 12c on their critical systems. But as the saying goes, you can always spot the pioneers by the arrows sticking out of their backs. Many people will refuse to upgrade to Oracle 12c until at least the second release. And many, many more simply won’t have a choice. We all spent a week talking about new or unreleased features that will change our lives, but how many customers will actually use them in production before next year’s slew of announcements?
The majority of organisations that I speak to are running legacy applications to support their businesses. The more risk-averse the business, the more ancient and convoluted the applications being supported (which is ironic when you consider the risk associated with maintaining old, complex code). Speak to any bank or telco and you’ll find applications from the previous decade running on versions of Oracle (or MSSQL, Sybase, etc.) that you’d almost forgotten about. Scratch the surface and you’ll find lots on 11g Release 2, lots more on 11gR1, plenty on 10.2 and maybe even 10.1. Dig really deep and, horror of horrors, 9i is only the beginning.
Not only that, but you’ll often find these databases aren’t even running on the terminal (i.e. final, fully supported) patchset! Why? Because upgrading an application or a database is a mammoth task, filled with risk and cost. I know I’m not the only one who has worked on an 18-month-plus database upgrade project that never even tasted success. Even applying a patchset requires full regression testing of the application – and if it’s a legacy application, what are the support implications?
In my view, despite all the talk of new technologies and paradigm shifts, the need to refresh legacy applications is more relevant now than ever. Perhaps I see it more clearly working for a company like Violin, because replacing legacy storage with flash memory offers a massive win with relatively little risk. Upgrading to 12c, on the other hand, is not a project to be treated lightly – despite the promise of features such as the Database In-Memory Option. Many customers simply cannot afford the time, money or risk associated with upgrades and migrations, whatever the potential rewards. Yet who is championing them?
I’m excited and intrigued by the new product launched by my employer Violin Memory, the Force 2510 Memory Appliance. I don’t usually use my blog to directly promote our products, but this one interests me because it sits below the application and database layers, offering memory speeds without application or database changes. I hope to get one in my lab soon so I can blog what I see…