The software company now known as Actian has a long and convoluted history with roots going back to the Ingres database first developed at the University of California, Berkeley 40 years ago. That’s 1973, when Marvin Gaye was singing Let’s Get It On, Roberta Flack was Killing Me Softly with His Song and Carly Simon was berating some unnamed man, or men, with You’re So Vain.
Ingres source code was built upon by a bewildering number of companies before CA took it over in 1994. Fast-forward a decade and CA released Ingres to open source at a time when Oracle, IBM and Microsoft had taken over the enterprise client/server database market but the OSS sector was sprouting serious alternatives. In 2005, CA spun out Ingres Corporation, and in 2011 Ingres became Actian, which is now beginning a new chapter with a series of acquisitions.
Confused? You should be. So why all the latest change?
“We transitioned away from open source because the unwillingness of governments to change their underlying databases was a hard equation to solve,” says Actian CEO, Steve Shine, when we meet for coffee on a sweltering day in central London. “They don’t like the maintenance but when it boiled down to it, it didn’t happen and it wasn’t going to be as high growth as we would have liked.”
So Actian searched for faster-growth opportunities, going on a buying spree to meet the current need for high-performance data analysis. Actian is still in the database game but it is also pursuing platforms for the analytics opportunity, fuelled by acquisitions of VectorWise two years ago and, in rapid sequence recently, ParAccel, Versant and Pervasive Software. The ambition: to create a powerhouse in, to use the term in vogue, Big Data.
“In selling mission-critical open source solutions as Ingres we found that a lot of companies were starting to push the limits of what they could achieve,” Shine says.
“They were looking at more and more data [from new sources] and a relational database management system is not optimised for Big Data. So they forced it onto more powerful hardware and copies of RDBMS systems but there’s a point where that eventually breaks.”
The fallout has been a rush of new databases, often non-relational and non-SQL, together with business intelligence systems geared to handle enormous data volumes, all on commodity hardware. Of course, Actian is far from unique in spying an opportunity in a market Gartner sees growing to a value of $232bn, but the firm now has a suite for corralling data into the database, integrating, storing and then interrogating it.
The market possibilities are broad. Actian customers include GE, Intuit, Nikon and Marathon Oil. Many are in the online world and span the gamut of human activities from gaming to dating to searching. The opportunity afforded to companies that get their IT infrastructure right for the new age of gushing information is not confined to the techie domain… it’s the possibility of disintermediating market-leading incumbents.
“It’s the chance to do what Amazon did to Waterstones,” says Shine. “[The old] software architectures are broken and can’t scale and buyers have to either take advantage [of the change] or protect [their existing positions].”
In its favour, Actian has deep experience and a string of admired software assets but hurting its chances is the competition from the gazillion-dollar giant stack players: Oracle, IBM, SAP, Microsoft and HP.
“They’ve missed the boat and they’re having a terrific time selling high-priced solutions based on proprietary hardware,” Shine mocks, but he knows that there will always be very large customers who prefer to buy from very large IT companies.
Actian also still has the challenge of integrating all those new assets. As Ovum analyst Tony Baer has written of the recent hat-trick of buys, “Making 1+1+1=3 is the challenge.”
A very interesting new turn from the protean company that was Ingres then, and a very long way to go.
Martin Veitch is Editorial Director at IDG Connect