Dimension Data Collaborates With ASO To Deliver GPS-based Real-Time Analytics On Tour De France

In July, Dimension Data announced details of its data analytics for the Tour de France, including sprint speeds, average speeds across the 21 stages, distance covered and the climbs associated with the lowest average speeds. The data was made available through a partnership with the Amaury Sport Organisation (ASO) and the 22 teams that participated in this year’s Tour de France. Dimension Data and the ASO arranged for live tracking sensors to be fitted under the saddle of each rider’s bike, and Dimension Data then took responsibility for aggregating, cleansing and analyzing the resulting data within its big data analytics platform.

Whereas cycling fans were previously limited to following the Tour de France by means of live television coverage, Dimension Data’s real-time big data analytics and data visualization platform allowed users to track their favorite riders, obtain aggregated statistics about groups of riders and review real-time data points such as rider speed, elevation climbed and distance traveled. The Dimension Data cloud delivered real-time analytics on rider position, speed and each rider’s relationship to other groups of cyclists by means of an analytic platform that used MongoDB and SQL Server for data aggregation and IBM Streams for real-time analytics on streaming data feeds. The 198 riders in the 2015 Tour de France generated approximately 75 million GPS readings that were tracked on the Dimension Data analytic cloud and transformed into a beta live-tracking website that gave fans and the press an unprecedented level of detail and granularity into the unfolding of the race.

Going forward, Dimension Data plans to deepen its partnership with the ASO to deliver increasingly enhanced end-user experiences to fans and followers of the Tour de France, over and beyond the delivery of real-time analytics. Now that the ASO and Dimension Data have built infrastructure capable of reporting that the highest recorded sprint speed in 2015 was 78.48 km/h, or that the average speed of all cyclists across all 21 stages was 38.34 km/h, Dimension Data plans to build progressively richer end-user portals, websites and dashboards to help fans experience and understand the race in the years to come.
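To give a rough sense of the kind of per-rider aggregation involved, the following is a minimal Python sketch and nothing more: the production pipeline ran on IBM Streams with MongoDB and SQL Server, and the field names and haversine-based speed calculation below are assumptions for illustration rather than Dimension Data’s implementation.

# Minimal illustrative sketch of per-rider GPS aggregation (hypothetical field
# names; the production pipeline used IBM Streams, MongoDB and SQL Server).
import math
from collections import defaultdict

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two GPS fixes, in kilometers.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class RiderAggregator:
    # Keeps running distance, elevation gain and current speed per rider.
    def __init__(self):
        self.last_fix = {}  # rider_id -> (ts_seconds, lat, lon, elevation_m)
        self.totals = defaultdict(lambda: {"distance_km": 0.0, "climb_m": 0.0, "speed_kmh": 0.0})

    def update(self, rider_id, ts, lat, lon, elevation_m):
        prev = self.last_fix.get(rider_id)
        if prev:
            prev_ts, prev_lat, prev_lon, prev_elev = prev
            dist_km = haversine_km(prev_lat, prev_lon, lat, lon)
            hours = max((ts - prev_ts) / 3600.0, 1e-9)
            totals = self.totals[rider_id]
            totals["distance_km"] += dist_km
            totals["climb_m"] += max(elevation_m - prev_elev, 0.0)
            totals["speed_kmh"] = dist_km / hours
        self.last_fix[rider_id] = (ts, lat, lon, elevation_m)
        return self.totals[rider_id]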


Q&A With Umesh Mahajan, CEO of Avi Networks, Regarding Its Cloud Application Delivery Platform

Cloud Computing Today recently had the opportunity to speak with Umesh Mahajan, CEO of Avi Networks, about the company’s cloud application delivery platform. Avi Networks uses analytics to help ensure consistent end-user experiences of applications by bringing the application delivery and load balancing technologies used by internet giants such as Facebook and Yahoo to the enterprise. The Avi Networks HYDRA-powered Cloud Application Delivery Platform empowers enterprises to optimize load balancing and application delivery within private cloud deployments as well as hybrid cloud infrastructures. Cloud Computing Today engaged Avi Networks CEO Umesh Mahajan about the cloud application delivery space, the need for hyperscale application delivery platforms within the enterprise and HYDRA’s analytics capabilities.

Cloud Computing Today: How do you envision the cloud application delivery space? What is the key differentiation of Avi Networks within the cloud application delivery space?

Umesh Mahajan (CEO, Avi Networks): Over the past decade, requirements for application delivery have gone through a complete transformation driven by changing modes of application consumption and an evolution of application architectures. In an attempt to keep up, traditional ADC vendors have brought forth incremental improvements in performance, scale, security and availability. I call this device-centric innovation, that is, making existing appliances better, bigger, faster. However, today we are faced with mega trends such as hybrid cloud adoption and a mobile-first access policy in most large enterprises. This has resulted in the emergence of the software-defined data center, which legacy networking technologies are hard-pressed to support.

In this new world, Avi Networks brings three key pieces of innovation. First, we use real-time analytics that track a range of telemetry data to dynamically adapt the application delivery services being provided. This enables IT to guarantee application SLAs and accounts for sudden spikes in user traffic. Second, we have developed the industry’s first distributed load-balancing architecture that’s based on SDN principles, with a clean separation of the control plane from the data plane. This provides a dramatic simplification of network operations via a single centralized controller, while the data-plane services can span, serve and scale across multiple data center and cloud locations. And finally, as opposed to the “pay-upfront” model that exists today with appliance-based application delivery solutions, we price our software-only solution based on the actual network services our customers consume.

Cloud Computing Today: Avi Networks attempts to bring the application delivery and load balancing technologies enjoyed by the likes of Facebook and Google to the enterprise. Describe the business need to bring the application delivery capabilities of companies such as Google and Facebook to enterprises that have vastly different workloads and business needs than internet companies that ingest petabytes of data daily.

Umesh Mahajan (CEO, Avi Networks): Companies such as Google, Netflix and Facebook, which operate hyperscale environments, are important because they have redefined the online end-user experience. It amazes me how a Google app can almost predict my next move and suggest ways to auto-complete my text. The other important benchmark hyperscale vendors have established is the expectation that their applications will never, ever go down. These are realms that most enterprise apps can only dream about today.

We built Avi Networks with the goal of enabling for enterprise apps the same level of application experience enjoyed by Web 2.0 and hyperscale users. We’ve also taken inspiration from, and integrated, ways to make data centers more efficient through software that can run on any hardware processor and appliance. Of course, the use of real-time analytics to drive smarter, even predictive, load balancing is another hyperscale innovation that we’re using. Finally, these companies have shown the power of automation that drives a “self-service” and agile operating model for network admins, which also distinguishes Avi from legacy ADC vendors.

Cloud Computing Today: What is most notable about HYDRA’s built-in analytics capabilities?

Umesh Mahajan (CEO, Avi Networks): At the highest level, we’re committed to making the job of the network administrators dead simple so that they can focus on strategic initiatives instead of simply keeping the lights on and troubleshooting application performance issues. It’s very common that the networking teams get the brunt of the blame for poor application performance, irrespective of the real reason for the issue. That’s where Avi comes in by arming network administrators with real-time data and troubleshooting capabilities to shorten what some people refer to as the “mean time to innocence.”

But what’s truly notable about the HYDRA architecture is that it is a true software-defined network model, composed of a centralized controller and a distributed, scalable data plane (called service engines) that can be co-located with the applications within and across the cloud. This tight alignment enables service engines to serve not only as micro load-balancers but also as integrated data collectors that pick up the ambient statistics about every user-to-application transaction. In this way they become hundreds of eyes in the data center, ever vigilant. This data is streamlined via reduction filters and compression techniques and then sent to a continuous data store within the Avi Controller. The output is very granular and gives real-time insights into the end-to-end timing of application transactions, application health scores, client logs, and client insights that can predict and proactively prevent any degradation in application performance, at any scale and any location, for every user.
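Mahajan’s description of service engines that reduce and compress per-transaction telemetry before reporting it to a centralized controller can be illustrated with a minimal, purely hypothetical Python sketch; none of the class or method names below correspond to Avi Networks’ actual APIs.

# Hypothetical sketch (not Avi Networks' API): a data-plane collector that
# reduces per-transaction metrics locally before reporting a compact summary
# to a centralized controller.
import time

class ServiceEngine:
    def __init__(self, app_name, controller):
        self.app_name = app_name
        self.controller = controller      # any object exposing an ingest(summary) method
        self.latencies_ms = []
        self.errors = 0

    def record_transaction(self, latency_ms, ok=True):
        # Called inline as the service engine load-balances each transaction.
        self.latencies_ms.append(latency_ms)
        if not ok:
            self.errors += 1

    def flush(self):
        # Reduction filter: ship a summary instead of raw per-transaction samples.
        if not self.latencies_ms:
            return
        lats = sorted(self.latencies_ms)
        summary = {
            "app": self.app_name,
            "ts": time.time(),
            "txn_count": len(lats),
            "p95_latency_ms": lats[int(0.95 * (len(lats) - 1))],
            "error_rate": self.errors / len(lats),
        }
        self.controller.ingest(summary)
        self.latencies_ms, self.errors = [], 0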

Bio: Umesh Mahajan, Chief Executive Officer & Co-founder of Avi Networks

A seasoned executive and entrepreneur with 25+ years of experience in the tech industry, Umesh has helped develop the vision, strategy, and execution plan for several innovative technology products. Before co-founding Avi Networks, Umesh was the Vice President / General Manager of a $2B data center switching business at Cisco, where he led engineering, product management, marketing and operations for the Nexus 7000 and MDS switching products and the NX-OS operating system.

Prior to his work at Cisco, Umesh led the software team at Andiamo in architecting and delivering SAN-OS. Umesh holds a Master of Science degree in computer science from Duke, a Bachelor of Science degree from IIT Delhi, and 29 patents to date.

ExtraHop And Sumo Logic Collaborate To Deliver IT Insights That Combine Wire And Log Data

Wire data analytics leader ExtraHop and machine data analytics vendor Sumo Logic recently announced a partnership whereby ExtraHop’s wire data will complement machine data aggregated by Sumo Logic’s cloud platform. The partnership brings together ExtraHop’s leadership in wire data analytics and Sumo Logic’s recognized machine data analytics platform to create a unified framework for event detection and management. As a result of the collaboration, ExtraHop’s Open Data Stream delivers real-time, streaming feeds of wire data to Sumo Logic’s platform for aggregating and analyzing machine data. Meanwhile, Sumo Logic customers enjoy access to a more comprehensive universe of data about an IT infrastructure, its constituent applications and its network topology. ExtraHop enhances Sumo Logic’s cloud-based machine data platform with L2-L7 wire data, as illustrated below:

The ExtraHop dashboard depicted above illustrates the ability of the ExtraHop platform to analyze wire data that contains insights regarding application performance, security and infrastructure availability. The Sumo Logic dashboard shows the integration of ExtraHop’s wire data into the Sumo Logic platform and its corresponding user interface. Through the partnership, ExtraHop delivers real-time wire data feeds that Sumo Logic’s cloud platform ingests in order to deliver actionable business intelligence about the health of IT infrastructures based on the aggregation of log and wire data. The graphics illustrate how ExtraHop’s wire data enriches Sumo Logic’s aggregation of machine data by providing an additional dataset that Sumo Logic’s cloud platform can integrate into its massive, multi-tenant unstructured cloud database built on Amazon Web Services to deliver advanced analytics and data visualization for detecting infrastructure- and application-related events.
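The underlying pattern, in which ExtraHop’s Open Data Stream pushes real-time wire data observations to a hosted collection endpoint where they can be analyzed alongside log data, can be sketched roughly as follows. The endpoint URL and event fields are placeholders for illustration, not details of the actual integration.

# Hypothetical illustration of the integration pattern: posting a wire-data
# observation as JSON to a hosted log-collection endpoint. The real feed comes
# from ExtraHop's Open Data Stream; the URL and fields below are placeholders.
import json
import time
import urllib.request

COLLECTOR_URL = "https://collectors.example.com/receiver/v1/http/TOKEN"  # placeholder

def send_wire_event(client_ip, server_ip, l7_protocol, response_time_ms, status):
    event = {
        "timestamp": time.time(),
        "client_ip": client_ip,
        "server_ip": server_ip,
        "l7_protocol": l7_protocol,
        "response_time_ms": response_time_ms,
        "status": status,
    }
    request = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example: report a slow HTTP transaction observed on the wire.
# send_wire_event("10.0.1.25", "10.0.2.7", "HTTP", 842.0, 503)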

Mark Musselman, Vice President, Strategic Alliances at Sumo Logic, remarked on the significance of the partnership between ExtraHop and Sumo Logic as follows:

Adding ExtraHop data as a new source into the Sumo Logic service for proactive analysis against other feeds enables IT teams to gain deeper performance, security and business insights from across IT infrastructure. Sumo Logic’s cloud-native architecture means the service serves as an aggregation point for diverse data sources. The result is an IT team that acts on timely information from within their infrastructure – even information they did not know to ask for. A critical piece of the puzzle lies in Sumo Logic’s Anomaly Detection, a proprietary capability that delivers insight from patterns in data and insights beyond what IT teams themselves know to query.

Here, Musselman comments on the way in which ExtraHop’s data facilitates “deeper performance, security and business insights” by serving as an additional data source that enables advanced analytics about enterprise IT architectures. The integrated data repository created by the confluence of ExtraHop wire data and Sumo Logic log data leverages Sumo Logic’s proprietary advanced analytics and machine learning technology to deliver notifications about events of interest within the infrastructure, while iteratively refining those alerts based on the actions taken by their recipients. In all, the partnership between ExtraHop and Sumo Logic underscores the significance of wire data for machine data analytics and the internet of things, while enriching the capabilities of Sumo Logic’s cloud-based log management and analytics platform. With ExtraHop’s real-time wire data now streaming into the Sumo Logic platform, the case for a Sumo Logic IPO grows stronger, while ExtraHop similarly benefits from demonstrating the value of its wire data aggregation and analytics technology.

Base Enhances Sales Productivity Platform With Real-Time Analytics And Rich Data Visualization

Base, the CRM that leverages real-time data and analytics, recently announced a bevy of new features and functionality that bring real-time, Big Data analytics to cloud-based sales productivity management. Base’s proprietary technology aggregates data from sources such as phone calls, in-person meetings, social network-based prospects and news feeds, and subsequently delivers real-time notifications to sales professionals. As a result, sales teams can minimize their manual input of sales-related data and instead take advantage of the analytic and data visualization capabilities of the Base platform. The Base platform testifies to a qualitative shift within the CRM space, one marked by enhanced automation of sales operations workflows through the conjunction of real-time data, predictive analytics and data visualization. Uzi Shmilovici, CEO of Base, remarked on the positioning of Base within the larger CRM landscape as follows:

Base picks up where other CRMs have left off. Until now, legacy cloud Sales and CRM products like Salesforce have been accepted as ‘the norm’ by the enterprise market. However, recent advancements in big data, mobility and real-time computing reveal a need for a new generation of intelligent sales software that offers flexibility, visibility, and real-time functionality. If you’re using outdated technology that cannot adapt to the advanced needs of modern day sales teams, your competition will crush you.

Here, Shmilovici comments on the way in which big data, real-time analytics and the proliferation of mobile devices have precipitated the creation of a new class of sales applications that outstrip the functionality of “legacy cloud Sales and CRM products like Salesforce.” In a phone interview with Cloud Computing Today, Shmilovici elaborated on the ability of the Base platform to aggregate disparate data sources to produce rich, multivalent profiles of sales prospects that augment the ability of sales teams to convert leads into qualified sales. Base’s ability to enhance sales operations by means of data-driven analytics is illustrated by the screenshot below:

The graphic above illustrates the platform’s ability to track sales conversions at the level of individual sales professionals as well as sales managers or owners within a team. VPs of Sales can customize analytics regarding the progress of their teams to enable enhanced talent and performance management, in addition to gaining greater visibility into where the market poses its stiffest challenges. More importantly, however, Base delivers a veritable library of customized analytics that illustrates a prominent use case for the convergence of cloud computing, real-time analytics and Big Data technologies. As such, the success of the platform will depend on its ability to continue enhancing its algorithms and analytics while enriching the user experience that remains integral to the daily work of sales teams.
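As a rough illustration of the kind of per-representative conversion tracking described above, consider the following sketch; it is purely hypothetical and does not reflect Base’s implementation or data model.

# Purely illustrative sketch (not Base's implementation): rolling up aggregated
# activity events into per-rep lead-to-win conversion rates.
from collections import defaultdict

def conversion_by_rep(events):
    # events: iterable of dicts such as {"rep": "alice", "type": "lead_created"}
    # or {"rep": "alice", "type": "deal_won"}.
    leads = defaultdict(int)
    wins = defaultdict(int)
    for event in events:
        if event["type"] == "lead_created":
            leads[event["rep"]] += 1
        elif event["type"] == "deal_won":
            wins[event["rep"]] += 1
    return {rep: wins[rep] / leads[rep] for rep in leads if leads[rep]}

# Example:
# conversion_by_rep([{"rep": "alice", "type": "lead_created"},
#                    {"rep": "alice", "type": "deal_won"}])  # -> {"alice": 1.0}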

Pivotal Releases Pivotal Big Data Suite With Pricing Per Core And Annual Subscription

Not to be outdone by the slew of product and price announcements from Google, Amazon Web Services and Microsoft over the past week, EMC-VMware spinoff Pivotal announced a new product offering branded the Pivotal Big Data Suite on Wednesday. The platform delivers Pivotal Greenplum Database, Pivotal GemFire, Pivotal SQLFire, Pivotal GemFire XD and Pivotal HAWQ, in addition to unlimited use of Pivotal’s Hadoop distribution, Pivotal HD. Because the Pivotal Big Data Suite is priced on the basis of an annual subscription for all software and services, in addition to per-core pricing for computing resources, customers need not fear additional fees related to software licensing or customer support over and beyond the subscription price. Moreover, customers essentially have access to a commercial-grade Hadoop distribution for free as part of the subscription price.

Pivotal compares the Big Data Suite to a “swiss army knife for Big Data” that enables customers to “use whatever tool is right for your problem, for the same price.” Customers have access to products such as Greenplum’s massively parallel processing (MPP) data warehouse, GemFire XD’s in-memory distributed Big Data store for real-time analytics with a low-latency SQL interface, and HAWQ’s SQL querying for Hadoop. Taken together, the Pivotal Big Data Suite edges towards the realization of Pivotal One, an integrated solution that performs Big Data management and analytics for ecosystems of applications, real-time data feeds and devices, and that can serve the data needs of the internet of things, amongst other use cases.

More importantly, the Pivotal Big Data Suite represents the most systematic attempt yet to productize Big Data solutions in the industry at large, even if it is composed of an assemblage of heterogeneous products under one roof. The combination of a commercial-grade Hadoop distribution (Pivotal HD), a data warehouse designed to store petabytes of data (Pivotal Greenplum) and closed-loop, real-time analytics solutions (Pivotal GemFire XD) within a unified product offering, available via an annual subscription and per-core pricing, constitutes an offer not easy to refuse for anyone seriously interested in exploring the capabilities of Big Data. The bottom line is that Pivotal continues to push the envelope with respect to Big Data technologies, although it now faces the challenge posed by cash-flush Cloudera, which recently finalized $900M in funding and a strategic and financial partnership with Intel.
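Because Greenplum and HAWQ are derived from PostgreSQL, a standard Postgres client can issue SQL against the data warehouse as well as against Hadoop-resident data through HAWQ. The following is a rough sketch under that assumption; the host, credentials, table and query are hypothetical rather than a Pivotal-supplied example.

# Hedged sketch: querying HAWQ (or Greenplum) with a standard PostgreSQL driver.
# Host, credentials and the clickstream table below are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="hawq-master.example.com", port=5432,
    dbname="analytics", user="gpadmin", password="secret",
)
with conn, conn.cursor() as cur:
    # Aggregate events stored in HDFS-backed tables via HAWQ's SQL engine.
    cur.execute("""
        SELECT device_type, count(*) AS events
        FROM clickstream
        WHERE event_date = %s
        GROUP BY device_type
        ORDER BY events DESC
    """, ("2014-03-26",))
    for device_type, events in cur.fetchall():
        print(device_type, events)
conn.close()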

Neo4j Adopted By Retail Giants eBay and Walmart For Real-Time, E-commerce Analytics

Neo Technology recently announced that retail giants eBay and Walmart are using the graph database Neo4j in production-grade applications that improve their operations and marketing analytics. In a recently published case study, Neo Technology revealed how Shutl, the e-commerce delivery technology platform acquired by eBay, leverages Neo4j to expedite delivery to the point where customers can enjoy same-day delivery in select cases. Shutl constitutes the technology platform that undergirds eBay Now, a service that delivers products from local stores in 1-2 hours by means of relationships between couriers and stores. eBay decided to make the transition from MySQL to Neo4j because:

Its previous MySQL solution was too slow and complex to maintain, and the queries used to calculate the best route additionally took too long. The eBay development team knew that a graph database could be added to the existing SOA and services structure to solve the performance and scalability challenges. The team turned to Neo4j as the best possible solution on the market.

According to Volker Pacher, Senior Developer at eBay, eBay found that Neo4j enabled dramatic improvements in its computational and querying ability:

We found Neo4j to be literally thousands of times faster than our prior MySQL solution, with queries that require 10-100 times less code. Today, Neo4j provides eBay with functionality that was previously impossible.

eBay’s current e-commerce technology platform leverages Ruby, Sinatra, MongoDB, and Neo4j. Importantly, queries “remain localized to their respective portions on the graph” in order to ensure scalability and performance. Walmart, meanwhile, uses Neo4j to understand the online habits of its shoppers and deliver more relevant real-time product recommendations to them. Neo4j’s adoption by eBay and Walmart illustrates how graph databases are disrupting real-time analytics, a trend further underscored by Pivotal HD 2.0’s integration of GraphLab into its offerings and the use of graph technologies by startups such as Aorato.
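By way of illustration only, a localized courier-to-store query of the kind described above might be expressed against Neo4j using the official Python driver roughly as follows; the labels, properties and relationship names are hypothetical, and this is not eBay’s Ruby and Sinatra implementation.

# Hypothetical sketch using the official Neo4j Python driver: find available
# couriers that serve a given store. Labels and properties are illustrative.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

FIND_COURIERS = """
MATCH (s:Store {id: $store_id})<-[:SERVES]-(c:Courier)
WHERE c.available = true
RETURN c.name AS courier, c.avg_pickup_minutes AS pickup_minutes
ORDER BY pickup_minutes
LIMIT 5
"""

def couriers_for_store(store_id):
    # The query anchors on a single Store node, so it touches only the local
    # neighborhood of the graph rather than scanning the graph globally.
    with driver.session() as session:
        return [dict(record) for record in session.run(FIND_COURIERS, store_id=store_id)]

# Example:
# print(couriers_for_store("store-123"))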