DigitalOcean today announced the release of load balancers for its cloud platform. Load balancers have long been among the most highly requested products from DigitalOcean customers, and their release testifies to the company’s increased commitment to supporting “larger, scale-out” applications. Julia Austin, CTO of DigitalOcean, remarked on the significance of the introduction of load balancers to the DigitalOcean platform as follows:
We’re quickly expanding the capabilities of our cloud to support larger scale-out applications. With Load Balancers, we are providing developers and businesses with a simple service for maximizing the availability and reliability of applications without disrupting the end user experience. Load Balancers is the first major new product DigitalOcean has released this year. Over the coming year, you’ll see us continue to release a number of important products and features to meet our customers’ high availability, data storage, security, and networking needs.
Here, Austin comments on how the availability of load balancers on the DigitalOcean platform promises to improve application uptime and reliability while minimizing disruption to the end user experience. The introduction of load balancing empowers developers to horizontally scale traffic across healthy droplets to ensure high availability for their applications. By making load balancers available on its cloud platform, DigitalOcean signals an interest in accommodating applications of a larger scale than those its IaaS platform has featured to date. As Austin notes, the company plans to release a range of products over the coming year that address the evolving needs of its customers. With nearly one million registered users and 40,000 teams actively using its platform since its cloud launched five years ago, DigitalOcean appears ready to expand its portfolio to satisfy the needs of increasingly large, complex and data-intensive applications that may have additional storage, networking and data security requirements. As such, the release of load balancers inaugurates a new phase in the company’s trajectory: one that underscores the success of its global reception and the consequent need to support increasingly complex, large-scale applications while preserving the simplicity that constitutes a key component of its branding and product differentiation in the IaaS space.
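The core idea of routing traffic only to healthy droplets can be illustrated with a minimal, self-contained sketch. This is not DigitalOcean’s implementation; it is a toy health-aware round-robin balancer, with backend names chosen purely for illustration:

```python
import itertools

class RoundRobinBalancer:
    """Toy round-robin load balancer: distributes requests across
    backends, skipping any that are marked unhealthy."""

    def __init__(self, backends):
        self.health = {b: True for b in backends}
        self._cycle = itertools.cycle(backends)

    def mark(self, backend, healthy):
        # In a real balancer this flag would be driven by periodic
        # health checks against each backend.
        self.health[backend] = healthy

    def next_backend(self):
        # Scan at most one full cycle looking for a healthy backend.
        for _ in range(len(self.health)):
            candidate = next(self._cycle)
            if self.health[candidate]:
                return candidate
        raise RuntimeError("no healthy backends available")

lb = RoundRobinBalancer(["droplet-1", "droplet-2", "droplet-3"])
lb.mark("droplet-2", False)  # simulate a failed health check
picks = [lb.next_backend() for _ in range(4)]
print(picks)  # the unhealthy droplet is skipped
```

A production balancer adds TLS termination, connection draining, and configurable algorithms, but the skip-the-unhealthy-node loop above is the essence of how traffic keeps flowing when a droplet fails.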
The following interview with Amazon Web Services CEO Andy Jassy provides rare insight into Jassy’s thinking about the genesis of AWS and about competitors who either focused on building tools “further up the stack” or, like Oracle and IBM, spent an “ungodly long time” before joining the opportunity served up by the public cloud space. The interview illustrates AWS’s early mission of rendering enterprise-grade infrastructure available to developers and startups, with the concomitant vision of expanding its portfolio of services after achieving greater adoption.
In a recent blog post, Brad Smith, President and Chief Legal Officer of Microsoft, announced details of Microsoft’s plans to “protect innovation and investments as it applies to the cloud.” Citing research by the Boston Consulting Group that noted an increase in cloud-related intellectual property litigation over the last five years, Smith elaborated on Microsoft’s recognition of the risks associated with doing business in the cloud in the form of the Microsoft Azure IP Advantage program. The program gives customers unlimited indemnification coverage that now includes open source technologies such as Hadoop. In addition, it allows customers to make use of roughly 10,000 Microsoft patents to protect themselves against charges of intellectual property infringement. The patents draw upon Microsoft’s vast portfolio and promise to give customers an unprecedented degree of legal protection for Azure-based applications. Moreover, Microsoft will offer continued legal protection to customers using one of its patents in the event that Microsoft transfers that patent to another entity. Customers are eligible for Microsoft Azure IP Advantage if they spend at least $1,000/month on Azure and have not filed patent-related lawsuits against other Azure customers.
The Microsoft Azure IP Advantage program represents a key differentiator for Microsoft Azure against Amazon Web Services and the Google Cloud Platform, given that it draws upon Microsoft’s extraordinary tenure in the tech industry and the slew of patents it has accumulated over decades as one of the world’s premier technology companies. The program, which has no parallel in the cloud industry at present, underscores the out-of-the-box thinking of the Azure team as it strives to bolster its differentiation from AWS and Google by tapping into Microsoft’s history as a software company as well as its deep relationships with enterprise customers. Microsoft’s Satya Nadella and Scott Guthrie will need to keep their foot on the gas pedal, however, as Amazon Web Services and Google both continue to roll out new features and functionality at a breathtaking pace. For now, the Microsoft Azure IP Advantage program demonstrates an extraordinary attentiveness to customer needs and marks another feather in Azure’s cap that allows it to stand out from other players within the cloud jungle.
Subsequent to its original IPO filing on February 1, Snap Inc., the camera company and parent of the popular messaging application Snapchat, has disclosed in an amendment to that filing an agreement with Amazon Web Services to spend a total of $1B on AWS cloud services between January 2017 and December 2021. Under the terms of the agreement, Snap Inc. plans to spend $50M in 2017, $125M in 2018, $200M in 2019, $275M in 2020 and $350M in 2021. News of the agreement reveals Snap’s two-pronged strategy with respect to public cloud providers, given last week’s disclosure of its contract with Google for $2B in cloud services over the course of five years. Snap will use AWS for “redundant infrastructure support of our business operations” in a move that illustrates the strategic importance of a multi-cloud strategy for enterprises, particularly given corporate and investor concerns about vendor lock-in and availability. Snap Inc.’s amended IPO filing underscores both the incipient success of Google Cloud’s strategy to court enterprise customers under the leadership of Diane Greene and the continued popularity of AWS within the enterprise in the face of intensified public cloud competition from Microsoft Azure, Google, IBM and Oracle.
Today, Qumulo announced the release of Qumulo Core 2.6, its data-aware storage platform designed for storing massive quantities of unstructured and file-based data. Notable in this release is the implementation of native quotas: storage quotas integrated directly into the file system. Native quotas reduce the operational overhead of maintaining storage by absolving administrators of the need to abide by storage designations specific to legacy storage systems. In addition, Qumulo Core 2.6 features intelligent quotas, defined by policy and backed by real-time queries. Together, these “machine intelligent quotas” represent a significant milestone in Qumulo Core’s evolution by automating and simplifying storage management at scale. They deliver enhanced operational agility for Qumulo customers, complementing the platform’s flagship data-aware functionality that gives users granular visibility into stored data. The nexus of data-aware functionality and machine intelligent quotas consolidates Qumulo’s positioning as a leader in file- and object-based storage by underscoring the automation and operational efficiency of its platform for web-scale storage across on-premises, cloud-based or hybrid cloud infrastructures. Meanwhile, the announcement of Qumulo Core 2.6 comes in tandem with the release of the Qumulo QC360, a hybrid storage appliance featuring the data-aware, scale-out NAS storage functionality that represents one of the pillars of the Qumulo brand.
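The appeal of quotas that live inside the file system, rather than in an external management layer, is that limits are enforced at write time. The following is a grossly simplified sketch of that idea; the class and method names are illustrative and bear no relation to Qumulo’s actual API:

```python
class QuotaFS:
    """Toy model of directory quotas enforced inside the file system:
    every write is checked against the directory's byte limit."""

    def __init__(self):
        self.quotas = {}  # directory path -> byte limit
        self.usage = {}   # directory path -> bytes currently used

    def set_quota(self, path, limit_bytes):
        self.quotas[path] = limit_bytes
        self.usage.setdefault(path, 0)

    def write(self, path, nbytes):
        # Enforce the quota at write time, as an integrated quota would,
        # instead of discovering overages in a later audit pass.
        used = self.usage.get(path, 0)
        limit = self.quotas.get(path)
        if limit is not None and used + nbytes > limit:
            raise OSError(f"quota exceeded on {path}")
        self.usage[path] = used + nbytes

fs = QuotaFS()
fs.set_quota("/projects/video", 1_000_000)  # hypothetical 1 MB quota
fs.write("/projects/video", 800_000)        # accepted
try:
    fs.write("/projects/video", 300_000)    # would exceed the limit
except OSError as err:
    print(err)
```

The “intelligent” part described above layers policy and real-time queries on top of this basic check, so limits can follow directories as data grows rather than being set once by hand.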
Expect Qumulo to continue building out its storage infrastructure as the contemporary proliferation of data-intensive applications drives the need for more intelligent storage automation driven by data aware analytics and enhanced operational efficiencies.
Palo Alto-based Minio recently announced the general availability of its object storage server designed for cloud-based environments. Minio empowers developers to store unstructured data in private and public cloud environments using a distributed object storage server featuring erasure coding and bitrot detection capabilities. Minio’s ability to store massive volumes of unstructured data gives developers access to object storage with functionality analogous to Amazon S3 that can be deployed on other cloud providers such as DigitalOcean and Packet. The platform boasts compatibility with the Amazon S3 API, advanced data protection functionality and support for Lambda functions that automate the execution of scripts operating on incoming or existing datasets. Available under the Apache License 2.0, Minio has been embraced by open source communities such as Mesos, Docker and Kubernetes, which recognize the product differentiation represented by its open source, cloud-native architecture and streamlined deployment functionality. The company’s open source object storage platform claims over 125K code contributions and widespread deployment as a Docker container. The general availability announcement marks a milestone in Minio’s evolution that builds upon its early adoption success and its emergence as a key player in the cloud-native object storage space. Expect Minio’s adoption to continue expanding, particularly in light of its general availability announcement and deepening specialization in delivering cloud-native object storage for massive quantities of unstructured data.
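Bitrot detection, one of the data protection features mentioned above, amounts to storing a checksum with each object and verifying it on read so that silent on-disk corruption is caught instead of returned to the caller. The sketch below illustrates the principle with SHA-256 over an in-memory store; it is a simplification, not Minio’s actual implementation or hashing scheme:

```python
import hashlib

def put_object(store, key, data: bytes):
    """Store the object alongside a SHA-256 checksum (a stand-in for
    the per-object hashing a server with bitrot detection performs)."""
    store[key] = (data, hashlib.sha256(data).hexdigest())

def get_object(store, key) -> bytes:
    data, checksum = store[key]
    # Recompute on read: a mismatch means the bytes changed on disk
    # without any write -- i.e., silent corruption (bitrot).
    if hashlib.sha256(data).hexdigest() != checksum:
        raise IOError(f"bitrot detected in object {key!r}")
    return data

store = {}
payload = b"\xff\xd8\xff example image bytes"
put_object(store, "photos/cat.jpg", payload)
assert get_object(store, "photos/cat.jpg") == payload

# Simulate one corrupted byte on disk: the read now fails loudly.
data, checksum = store["photos/cat.jpg"]
store["photos/cat.jpg"] = (b"X" + data[1:], checksum)
try:
    get_object(store, "photos/cat.jpg")
except IOError as err:
    print(err)
```

In a real deployment this check is paired with erasure coding, so a corrupted shard can be detected and then reconstructed from the healthy ones rather than merely rejected.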