Hortonworks Data Cloud optimized to spin up (and down) for AWS

Hortonworks today announced the general availability of Hortonworks Data Cloud for AWS, a new cloud service specifically optimized for enterprise ephemeral workloads on AWS. It is designed to integrate with AWS services including Amazon S3, RDS and EC2.

AWS customers have been able to deploy Hortonworks Data Platform (HDP) on AWS for years, but the new service is designed and optimized for ephemeral workloads that are spun up and down quickly. The data resides in Amazon S3, so when you shut down a cluster, the data is still there.

"HDP on cloud infrastructure as a service is great for lifting and shifting the long-running clusters that most folks are familiar with with HDP," says Shaun Connolly, chief strategy officer, Hortonworks. "Hortonworks Data Cloud actually goes after a different configuration. We've carved out specific, prescriptive user experiences for data science and exploration, ETL and data prep, analytics and reporting. Use cases that benefit from prescriptive, preconfigured, pre-tuned experiences out of the box."

Connolly says the new cloud service, powered by open source (including Apache Hadoop, Apache Spark and Apache Hive), delivers popular enterprise-grade capabilities of HDP, but with hourly and annual billing options in the AWS Marketplace. He says the benefits of the new cloud service include the following:

  • The ease-of-use of a cloud platform as a service with a pay-as-you-go billing model
  • A fast onramp to running the most common Apache Hadoop, Spark and Hive workloads in the cloud
  • A prescriptive experience configured and pre-tuned for the most popular use cases, enabling data scientists, developers and end users to be more productive
  • The ability to focus more time on processing and deriving value from data and less time configuring and operating data platform infrastructure

"We are enabling modern applications on a connected data architecture and believe customers should have a consistent data experience across cloud and data center," Connolly says. "Hortonworks Data Cloud for AWS gives customers an on-demand cloud service for a prescriptive experience for the most common Hadoop, Spark and Hive use cases with community support and the flexibility of hourly and annual billing through existing AWS Marketplace accounts."

[ Related: Hortonworks DataFlow 2.0 focuses on enterprise readiness ]

"Hortonworks Data Cloud for AWS is a leading enterprise-ready open source Apache Hadoop platform which enterprises depend on to enable the creation of secure data lakes and deliver the analytics they need to innovate fast and power real-time business insights," Barry Russell, GM of Global Business Development, AWS Marketplace and Catalog Services, Amazon Web Services, said in a statement Tuesday. "our customers want easy-to-use software like Hortonworks Data Cloud for AWS that is available for immediate purchase and deployment in the Marketplace. This new partnership demonstrates our joint focus on powering real-time customer applications and delivering robust analytics that accelerate decision making and innovation for our customers."

Hortonworks Data Cloud for AWS is built, in part, on the CloudBreak technology from Hortonworks' 2015 acquisition of SequenceIQ. Connolly notes that while CloudBreak was conceived entirely around a container-based world, Data Cloud uses some container technology under the hood but is not itself container-based, because some of that technology isn't yet hardened enough for production workloads.

"They actually have to get a bit more feature-rich, secure and operationally resilient for mainstream," Connolly says. "That will happen, I think, in 2017."

[ Related: Hortonworks ups its security and data governance game with HDP 2.5 ]

Still, it provides a hint as to where Connolly and Hortonworks see things heading.

"I think it's pretty exciting," he says. "From our perspective, we see this world of what we call assemblies; modern data apps that are built from assemblies, which are a set of containers wired together for specific use cases. There's still more work to do there. That, paired with a cloud service offering that does a set of use cases very simply, that's the requisite first step."
