On Tuesday, machine data analytics vendor Splunk announced a 100% uptime SLA for Splunk Cloud, its cloud-based platform for operational intelligence. The guarantee represents the first SLA in the machine analytics industry to promise uptime to a degree that effectively dispels objections about the reliability of cloud infrastructures. Not only does the 100% uptime SLA assuage customer concerns about reimbursement for downtime, but more importantly, it asserts Splunk's confidence that Splunk Cloud is engineered to remain fully operational even if one or more of its constituent infrastructure components experiences a disruption.
Splunk also announced price reductions of up to 33%, which it attributes to economies of scale and increased efficiencies, along with more flexible service plans that scale from 5 GB/day to 5 TB/day and offer tenfold bursting capability to accommodate especially high spikes in customer workloads. Given that Splunk Cloud is hosted on AWS, the price reductions come as little surprise: AWS has cut its own prices more than 40 times, including a significant cut as recently as March. That said, Splunk's 100% uptime guarantee represents an impressive differentiator in a space where vendors have largely shied away from guaranteeing 100% uptime, although one would need to delve deeper into Splunk's remuneration policies to understand the real delta between 100% uptime and something fractionally close. Splunk's expanded scaling options and security features for a virtual private cloud hosted on AWS, marked by no data commingling, in conjunction with slashed prices, continue to consolidate its reputation as the leader in the machine data analytics space.
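To put the delta between 100% uptime and "fractionally close" guarantees in concrete terms, here is a back-of-the-envelope sketch of the downtime each common SLA tier permits in a 30-day month. The tiers listed are illustrative industry conventions, not Splunk's actual offerings:

```python
# Sketch: allowed downtime per 30-day month at common SLA tiers,
# illustrating the gap between 100% and "fractionally close" guarantees.
# The tiers below are illustrative, not Splunk's actual SLA terms.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

for sla in (99.9, 99.95, 99.99, 100.0):
    allowed_downtime = MINUTES_PER_MONTH * (1 - sla / 100)
    print(f"{sla}% uptime -> up to {allowed_downtime:.1f} minutes of downtime/month")
```

Even a 99.9% SLA, which sounds nearly identical to 100%, still permits roughly 43 minutes of downtime per month without triggering any remuneration, which is exactly why the remuneration policy matters as much as the headline percentage.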
Expect Splunk to expand its market traction on the back of its notable 100% uptime guarantee as the enterprise increasingly embraces the necessity of running analytics on machine data dispersed across a variety of infrastructures.
Amazon Web Services Follows Microsoft by Eliminating Inbound Data Charges
Amazon Web Services (AWS) promised to eliminate inbound data fees starting July 1 in a move that matched Microsoft’s recent announcement of the same with respect to its Microsoft Azure platform. Moreover, AWS slashed outbound data prices: the first 10 terabytes of outbound traffic per month drop from 15 cents to 12 cents per GB. The next 40 terabytes per month (50 terabytes total) have been discounted from 11 cents to 9 cents per GB. And the next 100 terabytes of outbound data transfer per month (150 terabytes total) will be discounted from 9 cents to 7 cents per GB. In a blog post, Amazon Web Services remarked: “There is no charge for inbound data transfer across all services in all regions. That means, you can upload petabytes of data without having to pay for inbound data transfer fees. On outbound transfer, you will save up to 68% depending on volume usage. For example, if you were transferring 10 TB in and 10 TB out a month, you will save 52% with the new pricing. If you were transferring 500 TB in and 500 TB out a month, you will save 68% on transfer with the new pricing.”
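AWS's quoted 52% savings for 10 TB in and 10 TB out can be reproduced from the tiered rates above. A minimal sketch follows, assuming a pre-cut inbound rate of $0.10/GB; that rate is an assumption that makes AWS's quoted figure work out, not a number stated in the announcement:

```python
# Sketch: reproducing AWS's "save 52%" example for 10 TB in / 10 TB out.
# The outbound tiers come from the article; the old inbound rate of
# $0.10/GB is an assumption, chosen because it matches AWS's quoted 52%.
GB_PER_TB = 1024

def outbound_cost(gb, tiers):
    """Monthly cost for `gb` of outbound transfer under (tier_size_gb, $/GB) rates."""
    cost = 0.0
    for size, rate in tiers:
        used = min(gb, size)
        cost += used * rate
        gb -= used
        if gb <= 0:
            break
    return cost

OLD_OUT = [(10 * GB_PER_TB, 0.15), (40 * GB_PER_TB, 0.11), (100 * GB_PER_TB, 0.09)]
NEW_OUT = [(10 * GB_PER_TB, 0.12), (40 * GB_PER_TB, 0.09), (100 * GB_PER_TB, 0.07)]

transfer = 10 * GB_PER_TB  # 10 TB in and 10 TB out per month
old_total = transfer * 0.10 + outbound_cost(transfer, OLD_OUT)  # inbound + outbound
new_total = 0.0 + outbound_cost(transfer, NEW_OUT)              # inbound now free
print(f"savings: {100 * (1 - new_total / old_total):.0f}%")     # -> savings: 52%
```

Note that outbound-only savings at this volume are 20% (from $1,536 to $1,228.80); the bulk of the headline 52% comes from the eliminated inbound fees.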
Microsoft announced its intention last week to eliminate inbound data transfer fees in the context of Press Association Sport, a partner of the Press Association, the national news agency of the UK. Given that Press Association Sport planned to upload “large amounts of text, data and multimedia content every month” into Windows Azure, the CTO of the Press Association remarked on the benefits of free inbound data transfers as follows: “Estimating the amount of data we will upload every month is a challenge for us due to the sheer volume of data we generate, the fluctuations of volume month on month and the fact that it grows over time. Eliminating the cost of inbound data transfer made the project easier to estimate and removes a barrier to uploading as much data as we think we may need.” Amazon followed suit a week after Microsoft’s June 22 announcement. In a June 29 blog post, AWS CTO Werner Vogels indicated that further price decreases from AWS were forthcoming as the company scaled and rendered its operations more efficient.
Quantifying Cloud Computing Market Share
Recent years have witnessed a proliferation of analyses about the size and relative market share of vendors in the cloud computing space. According to a post in GigaOM, UBS analysts estimate that “the total market for AWS-type services will be between $5-to-$6 billion in 2010 and will eventually grow to $15-to-$20 billion in 2014.” Gartner, meanwhile, estimates that the IaaS market will grow from $3.7 billion in 2010 to $10.5 billion in 2014. Forrester, in its recent report Sizing the Cloud: Understanding and Quantifying the Future of Cloud Computing, predicts that IaaS spending alone will grow from $2.9 billion to $5.85 billion by 2015.
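The spread between these forecasts becomes clearer when each is expressed as an implied compound annual growth rate. A quick sketch, using the midpoints of the quoted ranges (the start-year assignments are assumptions based on the publication dates above):

```python
# Sketch: implied compound annual growth rates (CAGR) behind each analyst
# estimate quoted above, using midpoints of the quoted ranges. The year
# spans are assumptions inferred from the forecasts' stated horizons.
def cagr(start, end, years):
    """Compound annual growth rate between two dollar values over `years` years."""
    return (end / start) ** (1 / years) - 1

estimates = {              # (start $B, target $B, years)
    "UBS":       (5.5, 17.5, 4),   # $5-6B -> $15-20B by 2014
    "Gartner":   (3.7, 10.5, 4),   # $3.7B in 2010 -> $10.5B in 2014
    "Forrester": (2.9, 5.85, 5),   # $2.9B -> $5.85B by 2015
}
for name, (start, end, years) in estimates.items():
    print(f"{name}: ~{100 * cagr(name and start, end, years):.0f}% annual growth")
```

The implied growth rates range from roughly 15% (Forrester) to over 30% (UBS and Gartner), which underscores just how far apart these forecasts really are.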
The discrepancies between these estimates of the current and future state of the IaaS space illustrate some of the difficulties specific to quantifying cloud computing market share, many of which derive from the following:
• The plurality of cloud computing modalities renders calculations of market share complex. While IaaS, PaaS and SaaS remain useful categories for understanding cloud computing deployments, vendors are increasingly offering more than one variation of the IaaS, PaaS and SaaS trinity. Amazon’s Elastic Beanstalk, for example, constitutes a PaaS offering from the largest IaaS vendor in the space. Meanwhile, Red Hat offers an IaaS product called CloudForms alongside a PaaS offering known as OpenShift. Moreover, analysts may choose whether or not to include SaaS, PaaS or consulting services in their estimates of cloud computing revenue.
• Vendors often refuse to disclose cloud computing revenues, especially if they are privately held or otherwise multi-tiered businesses wherein cloud revenue is minuscule in comparison to revenues from other services. Amazon Web Services constitutes the paradigmatic example here, but the recent acquisitions of Terremark by Verizon and Savvis by CenturyLink may serve as further cases in point, though most reports suggest that both Terremark and Savvis will function as independent business units within their parent companies, with detailed revenue breakdowns.
• Within the first half of 2011, Dell, HP, IBM, Oracle, Red Hat, Apple, Go Daddy and Microsoft have made increased commitments to cloud computing deployments in ways that promise to significantly impact the existing market share balance.
• The global nature of cloud computing renders quantification of market share challenging because many U.S. cloud computing vendors operate transnationally through channel partners that may or may not report revenue in a transparent fashion. Consider, in this regard, Joyent’s partnerships with ClusterTech and Qihoo 360 Technologies in China.
Despite these methodological difficulties, we can make some definitive statements about vendor revenue. Consider the following IaaS revenue data points for 2010:
a. Amazon Web Services: $500–700 million
b. Rackspace: $100 million
c. Terremark: $37.5 million, prior to acquisition by Verizon
d. Savvis: $15.2 million, prior to acquisition by CenturyLink
e. Joyent: $10–20 million
And SaaS revenue data points for 2010:
a. Salesforce.com: $1.3 billion
b. NetSuite: $200 million
c. Rightnow: $200 million
d. SuccessFactors: $200 million
e. Taleo: $200 million
Revenue for PaaS in 2010 is difficult to locate and widely believed to be minuscule. But given the sheer number and heterogeneity of cloud computing vendors and deployments, these numbers offer little consolation for analysts and investors seeking to understand trends in the cloud computing universe. How will Apple’s iCloud fit into this equation, for example? What about Facebook and Google? In what way will Microsoft’s Office 365 change market share in the productivity software space? Part of the difficulty of estimating cloud market share involves the lack of a common set of standards for measuring the size of cloud computing deployments, in addition to the challenges specific to locating data for annualized cloud-based revenue per vendor. Until interoperability standards emerge, analysts will need to develop new methods of imposing discipline and rigor on the conglomeration of cloud computing forms. Meanwhile, vendors and customers alike should push for interoperability standards that facilitate apples-to-apples comparisons of cloud offerings from vendors across the globe.
Go Daddy Offers IaaS Cloud Computing, With a Twist
Go Daddy’s recent announcement that it plans to enter the IaaS cloud computing market throws yet another twist into the contemporary evolution of the cloud computing space. While it will compete directly with Amazon Web Services and Rackspace, the domain registration and web hosting company proposes an IaaS solution called Data Center on Demand that provides fixed server resources for a monthly fee, in sharp contrast to the “elasticity” and “pay per use” attributes of IaaS cloud computing. Moreover, the marketing brochure for Go Daddy’s Data Center on Demand offering asks customers whether they have professional IT staff, noting, “managing Data Center On Demand machines requires technical expertise.” The disclaimer about professional IT staff reveals that Go Daddy has yet to build user-friendly management consoles that do not require the use of shell commands. The service does offer load balancing capabilities that “can load balance any volume of traffic among an entire network of machines” and are “amazingly simple to set up.” The twist in the evolution of cloud computing represented by Go Daddy’s product concerns its use of fixed pricing for fixed server resources. Data Center on Demand is currently in a limited release, with full deployment scheduled for July. Go Daddy’s entry into IaaS cloud computing marks a strategic move to leverage its ubiquitous brand name and gargantuan customer base to make a dent in the cloud computing revenues of AWS and Rackspace. Expect small businesses with technically savvy resources to lead the charge among its initial round of customers. Larger enterprises are likely to stick with Amazon, Rackspace and more user-friendly, pay-per-use models for now.
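The trade-off between Go Daddy's fixed pricing and the pay-per-use model can be framed as a break-even utilization calculation: a flat monthly fee wins only if the workload runs enough hours per month. The rates below are hypothetical placeholders chosen for illustration, not actual Go Daddy or AWS prices:

```python
# Sketch: break-even utilization between a fixed monthly fee and hourly
# pay-per-use pricing. Both rates are hypothetical placeholders, not
# Go Daddy's or any vendor's actual prices.
FIXED_MONTHLY = 50.0    # hypothetical flat fee for a fixed server allocation
HOURLY_RATE = 0.10      # hypothetical pay-per-use rate per server-hour
HOURS_PER_MONTH = 730   # average hours in a month

breakeven_hours = FIXED_MONTHLY / HOURLY_RATE
print(f"fixed pricing wins past {breakeven_hours:.0f} hours/month "
      f"({100 * breakeven_hours / HOURS_PER_MONTH:.0f}% utilization)")
```

Under these illustrative rates, fixed pricing only pays off above roughly two-thirds utilization, which is consistent with the intuition that steady, predictable workloads favor Go Daddy's model while bursty workloads favor pay-per-use.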