Puppet Enterprise 3.8 Brings Puppet’s IT Automation Technology To AWS, Hybrid Clouds, Bare Metal And Docker

IT automation vendor Puppet Labs today announces Puppet Enterprise 3.8, which features key updates to Puppet Node Manager and a new application called Puppet Code Manager. The enhanced Puppet Node Manager can now automate the initial provisioning of infrastructure using rule-based logic and parameters that dictate when infrastructure should be deemed ready for production. Puppet Node Manager also supports the launch and configuration of Docker containers, and a new Amazon Web Services module takes responsibility for the deployment and ongoing management of AWS resources. As Tim Zonca, Puppet's Director of Product Marketing, told Cloud Computing Today in a phone interview, the AWS module gives organizations a unified IT automation interface for managing on-premises and cloud-based DevOps processes, in place of juggling Amazon's native orchestration tools alongside a separate Puppet interface for automating, streamlining and simplifying infrastructure management. The graphic below illustrates Puppet Node Manager's user interface and the simplicity of its method for defining rules for infrastructure components:

Node-Manager-Shot-Updated

In addition to the enhanced Puppet Node Manager, Puppet announces Puppet Code Manager, an application that allows customers to define their infrastructure as code and subsequently manage that code, rather than the infrastructure itself, as it traverses the stages of the product and software development lifecycle. Puppet Code Manager lets IT teams apply a consistent methodology for changing, upgrading and testing their fleet of infrastructure components more expeditiously. Meanwhile, Razor, Puppet's bare-metal provisioning tool, is now generally available for discovering bare-metal hardware and provisioning operating systems onto it. Taken together, today's announcements represent another important step in Puppet's effort to consolidate its leadership position in the IT automation and orchestration space. Puppet's ability to apply its technology to a variety of infrastructures and platforms, including Amazon Web Services and Docker containers, underscores its relevance for IT management more generally. That said, the obvious question for Puppet Labs is whether its automation technology can keep pace with the bewildering rate of change in the cloud, Big Data and computing landscape, particularly as Big Data technologies continue their aggressive maturation and as application development, exemplified by Pivotal's support for a hosted distribution of Cloud Foundry on AWS, moves toward increasingly agile methodologies that value precisely the automated management functionality Puppet delivers.

Categories: Puppet Labs

Metanautix Releases Personal Quest To Enhance Access To Its Platform For Integrated Analytics For SQL, NoSQL and Hadoop Datasets

On Tuesday, Metanautix released Metanautix Personal Quest, a product that enables individuals to leverage the Metanautix platform to query data stored in Hadoop, NoSQL and relational database formats. Individual users can use Personal Quest to perform integrated analytics on data stored in relational and non-relational formats, thereby obtaining a unified view of data held across an organization's different applications and repositories. Users can download Personal Quest to their own machines and test the capabilities of the Metanautix data compute engine for an unlimited time period, subject to limits on data size and the number of queries. Quest's distributed compute engine enables the joining of SQL and non-SQL data sources without complex ETL processes. The video below shows how the integration of Metanautix Quest and Tableau enables customers to join Teradata SQL data with MongoDB NoSQL data to obtain a more granular understanding of sales by product by means of a few simple drag and drop operations. The clip illustrates how Metanautix Quest can execute a distributed join between store sales data stored in a Teradata database and product data stored in MongoDB, enabling a comparative analysis of monthly sales across product categories such as books, children, electronics and shoes. After a visual review of sales by product category in a Tableau workbook reveals that shoes had a significant impact on overall sales, users can perform another join to drill down on shoe sales by shoe type and learn that men's shoes and athletic shoes were largely responsible for the spike in the shoe category. The distributed join performed by Metanautix Quest on Teradata SQL data and MongoDB NoSQL data facilitates speedy analysis by means of a user interface that requires neither ETL nor the migration of data to a centralized staging repository.
As such, Metanautix Quest radically simplifies data analysis and data visualization given the proliferation of different kinds of datasets in small, mid-size and enterprise-level organizations alike. By giving individual users unlimited-duration access to Metanautix Personal Quest, Metanautix intends to underscore the power of its analytic engine for data stored in sources that include Hadoop, Teradata, MongoDB and other SQL and NoSQL data repositories.
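Conceptually, the federated join Quest performs has the same shape as a standard SQL join across two sources. The sketch below illustrates that shape in plain Python using an in-memory sqlite3 database as a stand-in for both the Teradata sales data and the MongoDB product data; the table names, column names and figures are invented for illustration and are not Metanautix's actual schema or API.

```python
import sqlite3

# In-memory database standing in for the federated compute engine; the two
# tables play the roles of the Teradata sales data and MongoDB product data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product_id INTEGER, month TEXT, amount REAL)")
conn.execute("CREATE TABLE products (product_id INTEGER, category TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    (1, "2015-01", 120.0), (2, "2015-01", 340.0), (2, "2015-02", 410.0),
])
conn.executemany("INSERT INTO products VALUES (?, ?)", [
    (1, "books"), (2, "shoes"),
])

# Join sales to products and aggregate by category and month -- the same
# shape of analysis described above, with no ETL or staging step.
rows = conn.execute("""
    SELECT p.category, s.month, SUM(s.amount) AS total
    FROM sales s JOIN products p ON s.product_id = p.product_id
    GROUP BY p.category, s.month
    ORDER BY p.category, s.month
""").fetchall()
for category, month, total in rows:
    print(category, month, total)
```

The drill-down described in the clip (shoe sales by shoe type) is simply another join of the same form against a finer-grained product table.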

Categories: Big Data, Metanautix

Vormetric Releases Cloud And Big Data Security Report and Infographic

The following infographic, courtesy of Vormetric, illustrates widespread adoption of cloud computing and Big Data technologies, at rates of 80% and 30% respectively. However, the graphic also points to intense concern among IT managers about cloud and Big Data security, particularly regarding lack of control over where data resides and inadequate encryption, security protocols and frameworks for protecting data and sensitive environments.

Categories: Vormetric

Pivotal Cloud Foundry Extends Its Reach With Support For Amazon Web Services

Today, Pivotal announces that Pivotal Web Services will offer Pivotal Cloud Foundry with Amazon Web Services as its hosted infrastructure. By making Pivotal Cloud Foundry available via a virtual appliance that supports deployment on Amazon Web Services, Pivotal extends its IaaS support beyond VMware vSphere, VMware vCloud Air and OpenStack. Support for Amazon Web Services means Pivotal now embraces hybrid cloud infrastructures for Cloud Foundry that combine VMware or OpenStack-based on-premises environments with Amazon's famed public cloud. In addition to its availability via one-click integration with Amazon Web Services, Pivotal Cloud Foundry is available as an Amazon Machine Image in the AWS Marketplace. The screenshot below illustrates Pivotal Cloud Foundry's integration with Amazon Web Services alongside a bevy of other integrations and tools for managing a Pivotal Cloud Foundry deployment:

Pivotal Cloud Foundry Ops Manager with AWS Tile

Now generally available, Pivotal Web Services with Enterprise support manages an AWS instance on behalf of the customer, absolving customers of the challenge of managing the AWS environment as it scales and changes in response to the demands of applications and data ingestion.

James Watters, Vice President and General Manager, Cloud Platform Group at Pivotal, remarked on the significance of today’s announcement as follows:

With the latest Pivotal Cloud Foundry release, Pivotal becomes the first major middleware vendor to include managed public cloud capacity in a software subscription at no additional cost. By offering hosted public cloud along with dedicated self-install on either public or private clouds, Pivotal Cloud Foundry provides the instant-on affordable capacity Line of Business (LOB) executives need with the robust security and automation features IT can also bring to private clouds. With today’s release, LOB and IT can finally agree on a single platform.

Here, Watters notes that Pivotal includes support for Amazon Web Services in a Cloud Foundry subscription at no additional cost. Moreover, Watters remarks that, by also supporting private clouds, Pivotal delivers enhanced operational agility to Line of Business teams that may want to use a public cloud for development before moving their applications back to their organization's on-premises environments. All told, Pivotal's support of Amazon Web Services for its Cloud Foundry distribution exemplifies Pivotal's mission of enhancing enterprise agile application development by means of cutting-edge technologies at the nexus of cloud computing and application development. In addition, AWS support for Pivotal Cloud Foundry dramatically enhances the potential for Cloud Foundry-based application portability and moves cloud-native application development toward greater interoperability and the adoption of open standards for contemporary computing.

Categories: Cloud Foundry, IaaS, Pivotal

Microsoft Announces Integrated Azure Internet Of Things Suite

Microsoft CEO Satya Nadella recently announced the Azure Internet of Things (IoT) Suite at Microsoft Convergence 2015 in Atlanta. The Azure IoT Suite provides the capability to connect to devices and other things, alongside the ability to acquire, analyze and visualize the data those devices generate. In addition to aggregating and analyzing IoT data, the suite aims to deliver “finished applications” for use cases such as “remote monitoring, asset management, and predictive maintenance.” Azure IoT features Windows 10 for the Internet of Things as well as Azure Stream Analytics, a preview service for ingesting and analyzing massive amounts of streaming data. The suite builds upon the Azure Intelligent Systems Service that Microsoft introduced in April 2014 to make “it easier for enterprises to securely connect, manage, capture and transform machine-generated data from Line of Business (LoB) assets, such as industry devices and sensors, regardless of OS platform.” Azure IoT represents the next step in the evolution of the Azure Intelligent Systems Service, which elicited “overwhelming” demand from customers, to the point where Microsoft increased the quota of participants with access to the preview release. Specific details of the Azure IoT Suite remain scant at this stage, but we should expect Microsoft to release a more detailed elaboration of its integrated internet of things platform in the weeks to come.

Categories: Microsoft Azure | Tags:

Tachyon Nexus Secures $7.5M In Series A Funding From Andreessen Horowitz

As reported in The Wall Street Journal, Tachyon Nexus, the company that aims to commercialize the open source Tachyon in-memory storage system, has raised $7.5M in Series A funding from Andreessen Horowitz. Tachyon is a memory-centric storage system that epitomizes the contemporary transition away from disk-based storage to in-memory storage. Based on the premise that memory-centric storage is increasingly affordable in comparison with disk-centric storage, Tachyon caches frequently read files in memory to create a “memory-centric, fault-tolerant, distributed storage system” that “enables reliable data sharing at memory-speed across a datacenter,” as noted in a blog post by Peter Levine, General Partner of Andreessen Horowitz. Tachyon's memory-centric design improves upon the speed and reliability of file-based storage infrastructures to meet the requirements of big data applications that must share massive volumes of data at increasingly fast speeds. Tachyon was founded by Haoyuan Li, a U.C. Berkeley doctoral candidate who developed the system at the U.C. Berkeley AMPLab. Tachyon is currently used at over 50 companies and supports Spark and MapReduce as well as data stored in HDFS and NFS. Tachyon Nexus's commercial product remains in stealth. Meanwhile, Peter Levine joins the board of Tachyon Nexus as a result of the Series A investment to support the development of what Levine envisions as “the future of storage” in the form of Tachyon-based storage technology.
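Tachyon's core idea, serving frequently read files from memory so repeated reads avoid disk, can be sketched in a few lines of Python. The class below is a conceptual illustration only, not Tachyon's actual implementation: the first read of a file hits the backing store, and subsequent reads are served from an in-memory cache.

```python
import os
import tempfile

class MemoryCachedStore:
    """Toy sketch of memory-centric reads: the first read of a path goes
    to disk; later reads of the same path are served from RAM."""

    def __init__(self):
        self._cache = {}      # path -> file bytes held in memory
        self.disk_reads = 0   # counts how often we actually touched disk

    def read(self, path):
        if path not in self._cache:
            with open(path, "rb") as f:   # cache miss: one disk access
                self._cache[path] = f.read()
            self.disk_reads += 1
        return self._cache[path]          # cache hit: memory-speed read

# Demo: two reads of the same file incur only one disk access.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hot data")
    path = f.name

store = MemoryCachedStore()
first = store.read(path)
second = store.read(path)
os.unlink(path)
print(store.disk_reads)  # 1
```

The real system adds what this sketch omits entirely: distribution across a datacenter, fault tolerance via lineage, and eviction policies for data that no longer fits in memory.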

Categories: Big Data, Venture Capital

Ford Partners With Microsoft Azure To Deliver Cloud-Based Services And Software Updates

Ford has announced a partnership with Microsoft under which it will use Azure to automate updates to automobile software such as its Sync 3 infotainment system, and to deliver functionality that lets owners check battery levels and remotely start, lock, unlock or locate their vehicles. As a result of the partnership, Ford vehicle owners with Sync entertainment and navigation systems will no longer need to take their cars to the dealership for periodic software upgrades; instead, the car can connect to a wireless network and download Sync enhancements directly. The Azure-based Ford Service Delivery Network will launch this summer at no extra cost to end users. Use cases enabled by the partnership between Azure and Ford are illustrated below:

Ford's Cloud Connected Services Goes Global

While Ford is ready to use long-time technology partner Microsoft for its public cloud needs, the Dearborn-based automobile giant prefers on-premises infrastructure for more sensitive data such as odometer readings, engine-related system data and performance metrics that reveal details about the operation of the vehicle. Indeed, part of the reason Ford chose Microsoft was its willingness to support a hybrid cloud infrastructure that integrates an on-premises data center environment with a public cloud such as Azure. As reported in InformationWeek, Microsoft will also help Ford process and analyze the massive amounts of data that stand to be collected from its fleet of electric and non-electric vehicles. Ford's Fusion electric vehicle, for example, creates 25 GB of data per hour, which must be pre-processed and filtered to reduce its volume to a point that renders aggregation manageable for reporting and analytics purposes. Ford's decision to partner with Microsoft reflects a growing trend within the automobile industry, one that includes the likes of Hyundai and Tesla, to use cloud-based technology to push software updates to vehicles and to gather data for compliance and product development reasons. The key challenge for Ford, and the automobile industry at large, will hinge on its ability to acquire internet of things-related automobile data and perform real-time analytics on it to reduce recalls and fatalities and to facilitate more profound enhancements in engineering-related research and development. Details of which Ford vehicles stand to benefit from Azure-powered software delivery this summer have yet to be disclosed.
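The pre-processing and filtering step described above, reducing a 25 GB/hour telemetry stream to a volume manageable for aggregation, typically combines downsampling with anomaly retention. The sketch below is a hypothetical illustration of that pattern; the sensor name, field layout and threshold are invented for the example and do not reflect Ford's actual pipeline.

```python
def reduce_telemetry(readings, interval_s=60, temp_limit_c=110.0):
    """Downsample a stream of (timestamp_s, sensor, value) readings:
    keep one reading per sensor per time interval, plus every reading
    that breaches the threshold, so anomalies are never dropped."""
    kept, seen = [], set()
    for ts, sensor, value in readings:
        bucket = (sensor, int(ts // interval_s))
        if value > temp_limit_c or bucket not in seen:
            seen.add(bucket)
            kept.append((ts, sensor, value))
    return kept

# Four engine-temperature readings within one minute collapse to the
# first reading of the interval plus the anomalous spike.
readings = [
    (0, "engine_temp", 90.0),
    (15, "engine_temp", 91.0),
    (30, "engine_temp", 115.0),  # exceeds threshold: always kept
    (45, "engine_temp", 92.0),
]
print(reduce_telemetry(readings))
```

A real vehicle pipeline would apply this kind of reduction per sensor channel on the car or at an edge gateway, so only the downsampled stream and the anomalies are shipped to the cloud for aggregation.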

Categories: Big Data, Microsoft Azure
