Cloudmeter’s Big Data Network Analytics Enables Integrated Application Performance Management

Cloudmeter today announces the general availability of Cloudmeter Stream, a non-invasive platform that enables customers to transform Big Data streams of network data into actionable business intelligence. Cloudmeter also announces the early access availability of Cloudmeter Insight, a SaaS application that combines back-end network analytics with front-end marketing analytics to deliver an integrated view of how users experience an application platform. Together, Cloudmeter Stream and Cloudmeter Insight extend the purview of Big Data analytics to network data and enable customers to obtain a 360-degree view of user interactions with their products. Both products provide access to network data without risking physical disruption to network infrastructure.

Cloudmeter’s analytics extend the DevOps movement by allowing operations teams to understand more precisely how IT infrastructure affects end-user experiences. Application owners configure business rules that determine which network data attributes constitute fields of interest. For example, customers can create rules that identify session errors, network traffic on specific servers, or the elapsed time between specific interactions with the platform. Users create business rules and manage the application more generally using an intuitive user interface featuring screens such as the following:
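Business rules of the kind described above amount to simple predicates over captured network events. The following is a minimal sketch in Python; the event schema and field names are illustrative assumptions, not Cloudmeter's actual API or data model:

```python
# Hypothetical rule evaluation over captured network events. The schema
# (session, server, status, time) is an illustrative assumption.
from datetime import datetime

events = [
    {"session": "a1", "server": "web-01", "status": 500,
     "time": datetime(2013, 2, 1, 9, 0, 0)},
    {"session": "a1", "server": "web-01", "status": 200,
     "time": datetime(2013, 2, 1, 9, 0, 42)},
    {"session": "b2", "server": "web-02", "status": 200,
     "time": datetime(2013, 2, 1, 9, 1, 5)},
]

# Rule 1: flag session errors (HTTP 5xx responses).
session_errors = [e for e in events if e["status"] >= 500]

# Rule 2: elapsed time, in seconds, between consecutive interactions
# within a single session.
def elapsed_times(evts, session):
    times = sorted(e["time"] for e in evts if e["session"] == session)
    return [(b - a).total_seconds() for a, b in zip(times, times[1:])]

print(len(session_errors))          # 1 error event
print(elapsed_times(events, "a1"))  # [42.0]
```

In a real deployment such predicates would run against live traffic rather than an in-memory list, but the shape of the rules is the same.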

Cloudmeter CEO Mike Dickey remarked on the innovation behind the platform’s network data capture:

Our new data capture technology is a culmination of many years of experience building network-based data capture products. It enables customers to gain real time access into the wealth of business and IT information without the need to connect to physical network infrastructure, and without introducing risk to production systems or application performance.

Dickey underscores how Cloudmeter’s technology brings the Big Data revolution to network data and thereby empowers customers to access “business and IT information” in ways that can transform both their marketing platforms and their IT infrastructure design. In an interview with Cloud Computing Today, Cloudmeter’s COO Ronit Belson remarked that, rather than falling into the category of DevOps products, the company’s platform more appropriately represents a disruptive innovation in the MarkOps space, defined by the integration of marketing-related front-end application design with the Operations-related design of a platform’s IT infrastructure. Cloudmeter Stream integrates with Big Data platforms such as Splunk and GoodData, allowing users to combine petabytes of machine data with data selectively culled by way of Cloudmeter’s business rules.
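An integration of this kind typically reduces to serializing rule-matched events into a machine-readable format that platforms such as Splunk readily index. A minimal illustrative sketch, with assumed field names rather than a real Cloudmeter schema:

```python
import json

# Export rule-matched events as newline-delimited JSON, a format that
# machine-data platforms such as Splunk can index. The event fields
# below are illustrative assumptions, not a Cloudmeter schema.
def export_events(events):
    """Serialize each event as one JSON object per line."""
    return "\n".join(json.dumps(e, sort_keys=True) for e in events)

sample = [
    {"server": "web-01", "session": "a1", "status": 500},
    {"server": "web-02", "session": "b2", "status": 200},
]
print(export_events(sample))
```

One JSON object per line keeps each event independently parseable, which is why it is a common interchange format for machine-data pipelines.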

Cloudmeter Stream is complemented by Cloudmeter Insight, a SaaS application that transforms data captured by Cloudmeter Stream into visual representations that allow application owners to comprehensively understand end-user experiences of an application, as represented below:

Cloudmeter Insight leverages widgets that allow users to customize reports and dashboards of their choosing. The result is an integrated view of an application’s back end and front-end user experience that gives application owners a truly holistic picture of how users engage with their platforms. Today’s announcements point toward two exciting new releases in the application performance management space as Big Data begins to deliver on its promise of 360-degree views of user experiences with technology platforms. Cloudmeter’s customer base includes Netflix, SAP, Saks Fifth Avenue and 1-800-Flowers.

Puppet Labs Survey Confirms Emerging Popularity Of DevOps

Puppet Labs, a leader in the IT automation space, recently released the 2013 results of its annual DevOps survey. The results confirm the arrival of DevOps as an emerging discipline within IT, defined fundamentally by increased collaboration between development and operations teams and technologies. The survey was administered to over 4,000 technology leaders in more than 90 countries, spanning businesses ranging from startups and small companies to Fortune 500 enterprises.

Highlights of the survey include the following:

• Organizations that have implemented DevOps deploy code 30x faster than those that have not
• Code developed within a DevOps environment exhibits 50% fewer failures, and such organizations recover from failures 12x faster
• The rate at which code is successfully deployed increases the longer DevOps has been implemented within an organization

Organizations of all sizes are gradually implementing DevOps and attesting to dramatically improved results. Meanwhile, the HR space is witnessing a corresponding proliferation of DevOps-related career opportunities that is likely to continue in the near future. In an interview with Cloud Computing Today, Puppet Labs CTO Nigel Kersten noted that DevOps requires culture change within software development organizations, not merely the adoption of tools for automating software deployments. As such, DevOps initiates a paradigm shift in software development that depends on skill-sets encompassing both technical architecture and the complexities of operations and application ownership.

The survey and the corresponding detailed report suggest that DevOps has finally arrived. Puppet Labs, the survey’s author, stands at the forefront of the DevOps revolution by way of its IT automation products for streamlining the deployment of software and the provisioning of hardware. Puppet Labs technology allows system administrators to effectively manage increasingly heterogeneous IT environments featuring private clouds, virtual machines and public clouds, all of which collectively serve multiple and diverse constituencies. Puppet Labs recently entered into a partnership with VMware that positions it to rapidly accelerate its traction within the enterprise space.

Puppet Labs released an infographic that summarized the results of its DevOps survey as illustrated below:

Sauce Labs Brings Cloud To Automated Web Application Testing Including iOS

San Francisco-based Sauce Labs has been recommended by Adobe after the latter recently announced the closure of BrowserLab, its testing platform for web applications. Adobe’s endorsement underscores Sauce Labs’s success with a use case for IaaS infrastructure distinct from the remote hosting of applications on outsourced infrastructure, namely, IaaS for testing and development purposes. Sauce Labs uses virtual machines to provide customers with over 100 combinations of browsers and platforms for testing web-based applications. The company’s SaaS software empowers enterprises to conduct detailed evaluations of the testing process by way of screenshots, video recordings and support for the Firebug plug-in for Mozilla Firefox.

Sauce Labs further enables customers to test applications on browsers for mobile devices such as iPads, iPhones and Android devices. Its virtualized testing infrastructure allows customers to perform A/B and multivariate testing across a wide range of browser-platform permutations while benefiting from the security and integrity of its virtualized environment. Because a fresh virtual machine is spun up for each testing instance, customers can rest assured that their tests are free of cookies and of any traces of other customers’ applications.
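A cross-browser run of this kind boils down to executing the same test against a matrix of browser/platform capabilities. A sketch of generating such a matrix follows; the browser and platform names are illustrative examples, not Sauce Labs's exact catalog:

```python
from itertools import product

# Generate browser/platform permutations for a cross-browser test run.
# In a real run, each capability dict would be handed to a remote
# WebDriver session; here we only build the matrix.
browsers = ["firefox", "chrome", "internet explorer", "safari"]
platforms = ["Windows 7", "Windows XP", "OS X 10.8", "Linux"]

capability_matrix = [
    {"browserName": b, "platform": p}
    for b, p in product(browsers, platforms)
]

print(len(capability_matrix))  # 16 combinations
```

Running the identical test script against every entry in such a matrix is what turns one test into full cross-browser coverage.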

Sauce Labs recently announced the availability of a new service, based on the open source project Appium, for testing iOS applications on devices such as iPads and iPhones. The Sauce Labs team rewrote Appium’s Python-based open source code in Node.js in order to make its iOS Appium testing platform accessible to a wider range of developers, such as those who focus on JavaScript. Adam Christian, Vice President of Development at Sauce Labs, remarked on Appium on Sauce and the innovation of Sauce Labs’s cloud-based automated testing environment as follows:

As the world becomes more mobile and online interactions increasingly move to specialized applications, it’s increasingly important that these apps perform and meet consumer demands. Testing these apps has been a slow and difficult process, often done manually by teams using physical devices. Automated testing enabled by Appium represents the future, and Appium written in Node.js represents the best course toward ensuring the code continues to evolve as needs change.
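An Appium-on-Sauce session is driven through the WebDriver protocol. The sketch below shows how such a session might be configured from Python; the credentials and app bundle URL are placeholders, and the capability names are assumptions for illustration rather than a definitive reference:

```python
# Sketch of configuring a hypothetical Appium-on-Sauce iOS session.
# Credentials and the app bundle URL are placeholders; a real test
# would pass these capabilities to a Selenium remote WebDriver.
SAUCE_USER = "my-user"         # placeholder account name
SAUCE_KEY = "my-access-key"    # placeholder access key

desired_caps = {
    "device": "iPhone Simulator",
    "platform": "Mac",
    "app": "http://example.com/MyApp.zip",  # placeholder bundle URL
}

remote_url = "http://{0}:{1}@ondemand.saucelabs.com:80/wd/hub".format(
    SAUCE_USER, SAUCE_KEY)
# driver = webdriver.Remote(remote_url, desired_caps)  # not executed here

print(remote_url)
```

The same capability dictionary, expressed in JavaScript, is what a Node.js-based Appium test would send, which is why the rewrite broadens the developer audience without changing the protocol.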

The screenshot below illustrates a sample testing scenario for Everest, an iOS application tested using Appium on Sauce:

Appium reorients Sauce Labs squarely toward iOS mobile applications in a move that positions Sauce Labs as the de facto cloud-based testing infrastructure for all platforms and programming languages. Its cloud-based platform for automated testing of web and mobile apps is used by developers and enterprises alike, including at least one prominent customer that is also an investor in the company. Given how customers in agile development environments iteratively tweak and extend branches of existing code, Sauce Labs’s cloud-based testing technology lies at the heart of the DevOps and continuous integration movement in application lifecycle management. Expect cross-platform, cloud-based testing to emerge as the standard for testing and QA in software development, particularly given the heterogeneity of browsers and platforms in the industry at large.

Revisiting 2013 Cloud Predictions By Serena Software

Serena Software, a leader in the ALM space according to Forrester Research, made three predictions about enterprise computing for 2013 at the turn of the New Year. I had a chance to revisit Serena’s predictions in collaboration with David Hurwitz, Serena’s SVP of Worldwide Marketing, now that 2013 is fully underway and the cloud landscape is witnessing its first major burst of new product releases and announcements. The predictions engage questions related to the emerging emphasis on building efficiency into deployment and application lifecycle management processes. Poised at the cusp of the DevOps revolution, with 4,000 enterprise customers that leverage its “orchestration” solutions for maximizing the business value of IT, Serena speaks to the transformation of enterprise operations in relation to cloud-based SDLC infrastructures.

I asked David Hurwitz to elaborate on the first two of Serena’s three predictions as illustrated below:

Prediction #1: Large Enterprises Exploit the Cloud Primarily to Speed Development Cycles
The cloud has emerged as a dominant and powerful software development and deployment platform. In 2013, even the largest and most conservative enterprises will move to the cloud for the development phases of their overall delivery lifecycles. These were the companies that resisted cloud computing in the past. Their use of the cloud will be quite considered, as they will use a hybrid approach, exploiting the public cloud for testing and staging, but a private cloud or on-premise resources for production delivery. Keeping software production instances within private resources supports the enterprise need for security and control.

Arnal Dayaratna: Most cloud vendors, Amazon Web Services included, have yet to fully grapple with regulatory constraints on sensitive data such as PHI and government-related, classified data. Do you think the cloud industry is sufficiently prepared to deal with regulations regarding the transmission and storage of sensitive data in order to accommodate the movement of large, conservative enterprises to cloud-based development?

David Hurwitz: No, the cloud industry isn’t fully prepared, and it isn’t even close to changing perceptions about being prepared. The issue of data domiciling is going to take a while to surmount, and not just when dealing with classified data. Commercial data that includes European Union customers, for instance, has to be handled according to EU mandates. Thus, the use of the cloud for financial services and public sector applications is going to lag, other than perhaps for pre-production environments.

Prediction #2: Help Desk Evolves from Technical Support to Business Support

In 2013, Help Desks will grow in importance from their present position as an IT function into a business support function. This “Business Desk” will handle both traditional technical support along with the non-traditional role of business support. The Business Support Desk will support customer-facing personnel with understanding new marketing offers, product offerings and the exploitation of other app-powered revenue features.

A companion trend is being driven by Bring Your Own Device (BYOD) activity. BYOD, for instance, has created more of a ‘do it yourself help desk’ for office workers since enterprise IT teams have their hands full with more pressing tasks and don’t always have the time for the everyday maintenance issues that BYOD creates.

Arnal Dayaratna: What factors suggest the evolution of help desks from technical support to business support? Has the success of specific software applications/platforms led you to this prediction? Or are there other reasons underlying the genesis of this observation?

David Hurwitz: The reason that help desks are evolving to include customer support functions is that online businesses are using applications to deliver revenue-generating capabilities. This leads to a merging of the technical with the commercial. In other words, internal and external users increasingly need assistance in navigating new product offerings that are conveyed by apps, including assistance with technical issues in using those apps. Savvy online businesses ask customers to reach out only once for this assistance. Stagnant businesses ask them to reach out twice – once to tech support and again to customer support.

Prediction #3: The Cost of Rework Drives a Rethink of IT Processes
Research firm voke released data showing that 40-50 percent of the work IT performs falls into the rework category, a huge tax on its ability to deliver new functionality. Further validating that claim, one large IT shop reported that 17 percent of its total work had to be redone at some point. Better software delivery processes will combine in 2013 to help IT deliver software right the first time, with the large companion benefit that much more software can be delivered by development organizations dramatically less burdened by rework.

Serena’s predictions about the evolution of the help desk and the growing attention to the rework of IT processes are both profound and highly original.

As SaaS applications proliferate throughout enterprises, application support staff shoulder the responsibility not only of administering specific applications and providing help-desk-style support, but also of defining and reinforcing best practices for software usage in the context of business problems. In the case of requirements management software that provides a unified online interface for documenting business requirements for the SDLC, for example, application support staff respond to user questions about best practices for writing textual business requirements, defining process flows, and creating use cases and test cases, in addition to functionality questions about the software itself.

The line between help desk and business support has truly blurred as more and more applications create a need for the help desk to evolve to optimally support revenue-generating business units. Correspondingly, the rise of DevOps and of SaaS applications for managing the intersection of development and operations permits (1) the development of metrics for tracking and optimizing IT operations; and (2) the measurement of data for tracking the achievement of those metrics. The industry should expect a revolution not only at the level of hosting but also in the arena of DevOps, driven by the increased availability of data about the efficiency of IT processes.
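As a concrete illustration of such a metric, the rework figures cited in Serena's third prediction reduce to a simple ratio over work-item data. The schema below is an assumption for illustration, not any vendor's data model:

```python
# Toy sketch: compute the share of completed work items that were
# rework, the metric implied by the voke and "17 percent" figures.
# The work-item schema is an illustrative assumption.
work_items = [
    {"id": 1, "rework": False},
    {"id": 2, "rework": True},
    {"id": 3, "rework": False},
    {"id": 4, "rework": False},
    {"id": 5, "rework": True},
    {"id": 6, "rework": False},
]

def rework_rate(items):
    """Fraction of work items flagged as rework."""
    return sum(1 for i in items if i["rework"]) / float(len(items))

print("%.1f%% of work was rework" % (100 * rework_rate(work_items)))
```

Tracking this ratio over time is exactly the kind of measurement that lets an organization confirm whether improved delivery processes are actually reducing the rework tax.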