Prescriptive Data recently elaborated details of Nantum, a cloud-based platform that delivers intelligent automation for the management of real estate. Nantum integrates sensor data from a multitude of sources to deliver actionable business intelligence about the performance of a building with respect to metrics such as energy consumption, occupancy and expense utilization. Specifically, the Nantum platform ingests and aggregates sensor data sources that include subway and traffic data to proactively understand the occupancy of buildings as a means toward optimizing the delivery of energy. Nantum’s machine learning analytics can predict seasonal and local variations in occupancy as a result of variables such as weather, holidays and notable news-related events. Moreover, the Nantum platform natively integrates with infrastructures for the delivery of energy within buildings to intelligently automate the usage of energy in different buildings and their constituent floors and rooms. Built using MongoDB 3.2 Enterprise Advanced, Prescriptive Data’s Nantum platform delivers real-time analytics to building operators via a dashboard containing actionable business intelligence as illustrated below:
Whereas an earlier version of the application was built using Oracle and Microsoft technologies via an on-premises deployment, the current version is delivered via the cloud using MongoDB Enterprise Advanced in conjunction with Amazon Web Services. The existing platform delivers prescriptive analytics that empower building operators to make decisions using machine learning-generated algorithms. Building operators can elect to implement the recommendations from Nantum’s smart building platform or customize them as desired. Importantly, the platform delivers recommendations about building management based on real-time data and iteratively optimizes the accuracy of its prescriptive analytics using machine learning technology. Expect to hear more about Nantum as it brings its smart building analytics to more buildings across the U.S. via the intersection of MongoDB-based big data, the AWS cloud, predictive analytics and real-time data feeds that produce interactive data visualizations for building owners and operators.
On Wednesday, Snowflake Computing announced enhanced automation for its cloud data warehouse, which specializes in storing and analyzing structured and semi-structured data. Snowflake’s Elastic Data Warehouse now boasts improved capabilities for automated scaling to accommodate dramatic increases in concurrency, thereby preserving the speed and optimization of queries amid spikes in storage capacity, user activity and writes to the database. The platform’s augmented automated scaling extends to the management of distributed data and metadata, allowing customers to ensure optimal query performance without manually re-distributing data fields and taxonomies as the database scales. In addition, Snowflake’s data warehouse optimizes dashboard and reporting capabilities by storing and reusing the results of frequently performed queries, thereby accelerating their delivery. Moreover, the company’s Elastic Data Warehouse allows customers to travel back in time to customer-defined milestone points in the history of the database, thereby empowering customers to expeditiously recover data or retrieve data history as a point of comparison. This ability to “milestone” the database enables Snowflake to deliver high availability and disaster recovery and thereby ensure business continuity for applications that leverage mission-critical data stored within its infrastructure.
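The milestone-based recovery described above is essentially point-in-time restore. The toy Python sketch below illustrates the idea; the class and method names are invented for illustration and bear no relation to Snowflake's actual mechanism or API.

```python
import copy

class VersionedStore:
    """Toy key-value store that snapshots state at named milestones,
    loosely illustrating point-in-time recovery ("time travel")."""

    def __init__(self):
        self.data = {}
        self.milestones = {}

    def put(self, key, value):
        self.data[key] = value

    def milestone(self, name):
        # Capture a full snapshot of the current state under a name.
        self.milestones[name] = copy.deepcopy(self.data)

    def restore(self, name):
        # Roll the store back to a previously captured milestone.
        self.data = copy.deepcopy(self.milestones[name])

store = VersionedStore()
store.put("orders", 100)
store.milestone("end-of-q1")
store.put("orders", 250)    # later writes mutate the current state
store.restore("end-of-q1")  # recover the earlier state for comparison
print(store.data["orders"])  # 100
```

A production system would record incremental metadata rather than full deep copies, but the recovery semantics are the same: a milestone names a state of the database, and restoring reverts to it.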
Taken together, the July 20 product enhancements from Snowflake Computing usher in increased automation and performance optimization for its cloud-based data warehouse, which boasts the ability to store and query big data, including JSON documents, using SQL. As enterprises increasingly turn to cloud-based data warehouses to manage petabytes of siloed data spread across a bewildering multitude of hardware and software platforms, expect Snowflake Computing’s Elastic Data Warehouse to expand its footprint in the enterprise data warehouse space given the nexus of its simplicity, richness of functionality and fully managed nature, which empowers customers to focus on analyzing data instead of managing the operational processes that undergird the storage and retrieval of big data.
This week, Boeing announced plans to transition its cloud-based applications for aviation analytics to Microsoft Azure. Boeing plans to host all of its aviation analytics applications on the Microsoft Azure platform instead of using Azure as one of many cloud platforms. Boeing’s aviation analytics applications are used to optimize fuel consumption, obtain data about flight trajectories and proactively manage aircraft maintenance for planes used by over 300 airlines. Notably, Boeing’s decision to leverage Azure adds another important enterprise customer for Azure and further reinforces Microsoft’s competitive advantage in the cloud-based internet of things space, given the surfeit of aviation-related devices likely to stream data to Azure as the relationship between Boeing and Microsoft deepens. Meanwhile, in related news, Microsoft elaborated how agricultural cooperative Land O’Lakes will transition its Winfield R7 application to Microsoft Azure. Winfield R7 delivers business intelligence that enhances the ability of farmers to make more informed agricultural decisions using data delivered via a mobile device. Land O’Lakes will also use Azure’s Cortana analytics applications to understand sensor data related to agriculture. Land O’Lakes had already announced a partnership with the Google Cloud Platform, but the decision to additionally collaborate with Microsoft Azure underscores Azure’s growing traction amongst enterprise customers.
In the meantime, you’ll still be able to depend on and continue to invest safely in Cloud9. It’s still business as usual—we’ll continue to work with our Ace Open Source community and to provide our innovative services to you and our hundreds of thousands of customers worldwide. Over time, we’ll work with AWS to do even more on your behalf.
Here, Daniels notes how Cloud9’s “hundreds of thousands of customers worldwide” can continue to safely use Cloud9 for the immediate future, although he expects the collaboration with AWS to enable the delivery of “even more” features and functionality going forward. Given its focus on empowering developers to write software in the cloud, Cloud9 stands to enhance the ability of AWS customers to develop applications within cloud-based environments as opposed to writing code on-premises and then transporting that code to the cloud. Cloud9’s customers include Atlassian, MailChimp, Mozilla and Salesforce.com. The acquisition of Cloud9 enriches Amazon’s portfolio of options for writing code directly within the Amazon Web Services infrastructure and, as such, gives AWS a competitive advantage over the likes of Microsoft Azure and the Google Cloud Platform. Terms of the acquisition were not disclosed.
This week, DigitalOcean, the developer-centric IaaS platform, announced the release of Block Storage, an SSD-based storage offering that enables developers to expand storage for their Droplets, DigitalOcean’s virtual machines, which provide compute capacity as well as local storage. The availability of Block Storage on the DigitalOcean platform allows developers to more effectively scale their SaaS applications and thereby dedicate more of their attention to designing applications instead of managing infrastructure. Block Storage is decoupled from the Droplet, thereby facilitating high availability by ensuring that data is stored independently of the Droplet-based computing infrastructure. Moreover, data stored within DigitalOcean’s Block Storage is encrypted at rest and transmitted securely to Droplets. The availability of Block Storage marks a notable addition to DigitalOcean’s platform insofar as it gives developers greater flexibility with respect to storage options for their applications, particularly in a contemporary business environment marked by an explosion in the volume and variety of data that feeds today’s software applications. Customers can store up to 16 TB of data using DigitalOcean’s Block Storage at a cost of $0.10/GB per month.
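Given the published rate of $0.10/GB per month, the cost of a volume is a simple multiplication. The helper below is an illustration only; the function name is ours, not part of any DigitalOcean API.

```python
def monthly_block_storage_cost(size_gb, rate_per_gb=0.10):
    """Estimate the monthly cost of a Block Storage volume at the
    published $0.10/GB-per-month rate."""
    return size_gb * rate_per_gb

# A 100 GB volume costs $10/month; the 16 TB maximum works out to
# roughly $1,638/month.
print(monthly_block_storage_cost(100))        # 10.0
print(monthly_block_storage_cost(16 * 1024))  # 1638.4
```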
On July 11, Microsoft Corporation announced a partnership with GE that will render GE’s Predix platform for the internet of things available on the Microsoft Azure public cloud platform. The announcement represents a notable coup for Microsoft Azure given that GE’s Predix is already hosted on Amazon Web Services, one of Azure’s principal competitors in the Infrastructure as a Service space. Built on Cloud Foundry, GE’s Predix platform is designed for the industrial internet and is accordingly optimized for data aggregation from a multitude of devices, in addition to capabilities to build and deploy scalable applications that run analytics on IoT data. The partnership reinforces Microsoft’s leadership in the IoT space, as exemplified by the Azure IoT Suite, and positions Microsoft to sharpen its competitive differentiation from AWS and the Google Cloud Platform with advanced capabilities for internet of things data capture, analytics and actionable business intelligence. Predix is expected to be generally available on Microsoft Azure by Q2 2017, with a developer preview in place by November 2016.
On June 28 at MongoDB World, MongoDB announced details of MongoDB Atlas, a database-as-a-service offering for MongoDB. MongoDB Atlas renders it easier for MongoDB users to deploy and manage MongoDB on a multitude of cloud platforms. Whereas MongoDB users previously needed to manage discrete cloud-based MongoDB deployments to ensure scalability, high availability and security, they can now take advantage of MongoDB Atlas to automate cloud-related service operations across a plurality of cloud platforms. Dev Ittycheria, president and CEO of MongoDB, remarked on the significance of MongoDB Atlas as follows:
MongoDB Atlas takes everything we know about operating MongoDB and packages it into a convenient, secure, elastic, on-demand service. This new offering is yet another major milestone for the most feature rich and popular database for modern applications, and expands the options for how customers can consume the technology – in their own data centers, in the cloud, and now as a service.
Here, Ittycheria comments on the ability of MongoDB Atlas to render MongoDB into a turnkey platform that allows developers to consume MongoDB as an on-demand service marked by elastic scalability. MongoDB Atlas delivers elastic scalability to cloud-based MongoDB deployments in addition to provisioning and upgrades as well as backup and recovery services. The elastic scalability delivered by MongoDB Atlas features automatic sharding functionality that allows for scaling with no application downtime. The MongoDB Atlas screenshot below gives customers a snapshot of metrics related to MongoDB deployments within the AWS North Virginia region:
As the graphic above illustrates, customers can use MongoDB Atlas to understand and monitor pricing across a multitude of instances. The larger vision of MongoDB Atlas, however, consists in its ability to deliver automation and oversight of MongoDB deployments across a multitude of cloud platforms, thereby giving customers a centralized platform from which to manage all of their cloud-based MongoDB infrastructures. MongoDB Atlas is currently available on Amazon Web Services although integrations with Microsoft Azure and the Google Cloud Platform are expected soon. The release of the platform marks a breakthrough moment not only with respect to enhanced capabilities for deployment and ongoing management of MongoDB but also with respect to data sovereignty and data governance, particularly in the context of multi-cloud, regionally dispersed hybrid cloud deployments. Expect MongoDB Atlas to facilitate increased adoption of MongoDB and subsequently expand its market share within the space of NoSQL, document-oriented databases.
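The automatic sharding underpinning Atlas’s elastic scalability amounts to distributing documents across shards according to a shard key. The Python sketch below shows the general idea using a stable hash; it is a deliberate simplification, as MongoDB actually partitions data into chunks and rebalances them across shards rather than taking a bare modulus.

```python
import hashlib

def assign_shard(shard_key_value, num_shards):
    """Map a shard-key value to one of num_shards partitions via a
    stable hash, so the same key always routes to the same shard."""
    digest = hashlib.md5(str(shard_key_value).encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# Spread ten hypothetical documents across three shards by shard key.
placement = {doc: assign_shard(doc, 3)
             for doc in ("user-%d" % i for i in range(10))}
print(placement)
```

Because a bare modulus would remap most keys whenever a shard is added, MongoDB instead migrates fixed chunks of the key space between shards, which is what allows a cluster to scale with no application downtime.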