SAN FRANCISCO: Oracle has launched a direct rival to the Amazon Web Services (AWS) public cloud with its own Elastic Compute Cloud.
The product was revealed amid a flurry of cloud-related product announcements, including five in the infrastructure-as-a-service (IaaS) space, at the OpenWorld show in San Francisco on Tuesday.
Oracle Elastic Compute Cloud adds to the Dedicated Compute service the firm launched last year. The new service lets customers run any workload on elastic compute capacity in a shared cloud zone, a basic public cloud offering.
“Last year we had dedicated compute. You get a rack, it’s elastic but it’s dedicated to your needs,” said Thomas Kurian, president of Oracle Product Development.
“We’ve now added in Elastic Compute, so you can just buy a certain number of cores and it runs four different operating systems: Oracle Linux, Red Hat, Ubuntu or Windows, and elastically scale that up and down.”
Oracle has yet to release pricing details for the Elastic Compute Cloud service, but chairman and CTO Larry Ellison said on Sunday that it will be priced at or below the equivalent AWS rates. For the dedicated model, Ellison revealed on Tuesday at OpenWorld that firms will pay half as much for Oracle Dedicated Compute as for the equivalent AWS shared compute option.
It is not surprising that Oracle wants a piece of the public cloud pie: AWS generated $2.08bn in revenue for Amazon in the quarter ending 30 September.
Kurian shared current use details for the Oracle Cloud as evidence of the success it has seen so far. The firm manages 1,000PB of cloud storage, and in September alone processed 34 billion transactions on its cloud. This was a result of the 35,000 companies signed up to the Oracle Cloud, which between them account for 30 million users logging in actively each day.
However, Oracle’s chances of knocking Amazon off its cloud-leader perch, or even making a slight dent in its share, seem low. That AWS revenue rests on Amazon’s 30 percent share of the cloud infrastructure services market, with second- and third-ranked Microsoft and IBM lagging behind at 10 and seven percent respectively.
Google and Salesforce have each captured less than five percent. Indeed, faced with such a competitive market and Amazon’s dominant position, HP has just left the public cloud business.
Even as Oracle goes head to head with AWS in the public cloud space, Amazon has been trying to lure Oracle customers onto its own platform.
“AWS and Oracle are working together to offer enterprises a number of solutions for migrating and deploying their enterprise applications on the AWS cloud. Customers can launch entire enterprise software stacks from Oracle on the AWS cloud, and they can build enterprise-grade Oracle applications using database and middleware software from Oracle,” the web giant notes on its site.
Amazon describes EC2 as letting users “increase or decrease capacity within minutes, not hours or days. You can commission one, hundreds or even thousands of server instances simultaneously”, making Oracle Elastic Compute Cloud a direct competitor.
Oracle has also added a hierarchical storage option to its archive storage cloud service, designed to automatically move data that requires long-term retention, such as corporate records, scientific archives and cultural preservation content.
Ellison noted that this archiving service is priced at a 10th of the cost of Amazon’s S3 offering.
Kurian explained the archive system: “I’ve got data I need to put into the cloud but I don’t need a recovery time objective. So you get it very, very cheap,” adding that it costs $1/TB per month.
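At the quoted $1/TB per month, archive costs scale linearly with capacity and retention period. A quick back-of-the-envelope sketch (the 500TB capacity and 12-month term below are illustrative assumptions, not Oracle figures):

```python
# Back-of-the-envelope cost for Oracle's archive tier at the quoted
# $1 per TB per month. The 500 TB capacity and 12-month term are
# illustrative assumptions, not figures given by Oracle.
RATE_PER_TB_MONTH = 1.00  # USD, as quoted by Kurian


def archive_cost(capacity_tb: float, months: int) -> float:
    """Total archive storage cost in USD for a flat capacity over time."""
    return capacity_tb * months * RATE_PER_TB_MONTH


print(archive_cost(500, 12))  # 500 TB retained for a year -> 6000.0
```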
The firm also launched what Kurian dubbed its “lorry service” for bulk data transfer. Oracle ships a storage appliance to a customer’s site, where the customer loads data directly onto the machine far faster than streaming it to the cloud. The appliance is then returned to Oracle via DHL or FedEx, Kurian explained, and Oracle transfers the data into its cloud storage.
“This is much faster if you’re moving a huge amount of data. One company is moving 250PB of data. To stream that amount of data to the cloud would take a very long time,” he said.
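The arithmetic behind that claim is easy to check. Assuming a sustained 10Gbit/s link (my assumption; Oracle gave no line speed), streaming 250PB would take years:

```python
# How long would streaming 250 PB to the cloud take? The 10 Gbit/s
# sustained link speed is an illustrative assumption, not an Oracle figure.
PETABYTE_BITS = 1e15 * 8  # one decimal petabyte expressed in bits


def stream_days(petabytes: float, gbit_per_s: float) -> float:
    """Transfer time in days at a given sustained line rate."""
    seconds = petabytes * PETABYTE_BITS / (gbit_per_s * 1e9)
    return seconds / 86_400  # seconds per day


print(round(stream_days(250, 10)))  # roughly 2,300 days, over six years
```

Even at 100Gbit/s the transfer would still take the better part of a year, which is why shipping a loaded appliance wins at this scale.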
Bulk data transfer will be available from November, while the archive service is available now.
“You can go up to shop.oracle.com as a customer, enter a credit card and you can buy the service, all the PaaS services and the storage service. We’re adding compute over the next couple of weeks,” Kurian explained.
“You pay for it by credit card, or by invoice if you’re a corporate customer, and pay by the hour or month: by processor for compute, or per gigabyte per hour or month for storage.”
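The metered model Kurian describes, compute billed per processor per hour and storage per gigabyte per hour, can be sketched as a simple invoice calculation. All rates below are hypothetical placeholders, since Oracle had not published prices:

```python
# Sketch of the metered billing model Kurian describes: compute billed
# per processor-hour, storage per gigabyte-hour. Both rates are
# hypothetical placeholders; Oracle had not published pricing.
COMPUTE_RATE = 0.05    # USD per processor-hour (assumed)
STORAGE_RATE = 0.0001  # USD per GB-hour (assumed)


def monthly_bill(processors: int, storage_gb: float, hours: int = 730) -> float:
    """Invoice total in USD for one month (~730 hours) of metered usage."""
    compute = processors * hours * COMPUTE_RATE
    storage = storage_gb * hours * STORAGE_RATE
    return round(compute + storage, 2)


print(monthly_bill(4, 1000))  # four processors plus 1 TB of storage -> 219.0
```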
Oracle Container Cloud, meanwhile, lets firms run apps in Docker containers and deploy them to the Oracle Compute Cloud, supporting better automation of app deployments using technologies such as Kubernetes.
Oracle also launched additional applications that sit in its cloud, including the Data Visualisation Cloud Service. This makes visual analytics accessible to general business users who do not have access to Hadoop systems or the data warehouse.
“All you need is a spreadsheet to load your data and a browser to do the analysis,” Kurian explained.
Several new big data cloud services are also aimed at letting users more easily prepare and analyse data using Hadoop as the data store, for example Big Data Preparation and Big Data Discovery.
“With Big Data Preparation you can move data into your data lake, you can enrich the data, prepare it, do data wrangling, cleanse it and store it in the data lake. Big Data Discovery lets a business user sit in front of Hadoop, and through a browser-based dashboarding environment search the environment, discover patterns in the data, do analysis and curate subsets of the data for other teams to look at. It’s an analytic environment and complete Hadoop stack,” Kurian said.