Datapipe’s Acquisition Of GoGrid Underscores The Industry Trend Of The Intersection Of Cloud And Big Data

Managed hybrid cloud IT solution provider Datapipe recently announced the acquisition of GoGrid, a leader in operationalizing Big Data for cloud deployments. While GoGrid boasts over a decade of experience in managed and dedicated cloud hosting, the company recently added a slew of Big Data offerings to its product line, including NoSQL database products, a 1-button deployment process and a partnership with Cloudera to accelerate Hadoop deployments for the enterprise. Robb Allen, CEO of Datapipe, commented on the significance of the acquisition as follows:

GoGrid has made it easy for companies to stand up Big Data solutions quickly. Datapipe customers will achieve significant value from the speed at which we can now create new Big Data projects in the cloud. This acquisition advances Datapipe’s strategy to help our enterprise clients architect, deploy and manage multi-cloud hybrid IT solutions.

Here, Allen remarks on the way in which GoGrid’s success in streamlining the implementation of Big Data solutions enhances Datapipe’s ability to offer enterprise customers Big Data solutions alongside managed cloud hosting. Given that cloud adoption has significantly outpaced Big Data adoption in the enterprise to date, Datapipe stands poised to consolidate its leadership among cloud vendors offering Big Data solutions to enterprise customers. By acquiring GoGrid, Datapipe positions itself to offer its customers the scalability of the cloud in addition to the infrastructure to store petabytes of data. The adoption of cloud-based Big Data solutions enables customers to run analytics in parallel on transactional and non-transactional datasets alike, deriving insights that draw upon the union of financial, operational, marketing, sales and third-party data. As a result, Datapipe’s acquisition of GoGrid cements its already strong market position in the nascent but rapidly expanding space marked by the intersection of cloud computing and Big Data.

Categories: Big Data, Datapipe, GoGrid

Conversation With John Fanelli, DataTorrent’s VP of Marketing, Regarding Analytics On Streaming Big Data

Cloud Computing Today recently spoke to John Fanelli, DataTorrent’s VP of Marketing, about Big Data, real-time analytics on Hadoop, DataTorrent RTS 2.0 and the challenges specific to performing analytics on streaming Big Data sets. Fanelli commented on the market reception of DataTorrent’s flagship product DataTorrent RTS 2.0 and the mainstream adoption of Big Data technologies.

1. Cloud Computing Today: Tell us about the market landscape for real-time analytics on streaming Big Data and describe DataTorrent’s positioning within that landscape. How do you see the market for real-time analytics evolving?

John Fanelli (DataTorrent): Data is being generated today in not only unprecedented volume and variety, but also velocity. Human-created data is being surpassed by automatically generated data (sensor data, mobile devices and transaction data, for example) at a very rapid pace. The term we use for this is fast big data. Fast big data can provide companies with valuable business insights, but only if they act on those insights immediately. If they don’t, the business value declines as the data ages.

As a result of this business opportunity, streaming analytics is rapidly becoming the norm as enterprises rush to deliver differentiated offerings to generate revenue or create automated operational efficiencies to save cost. But it’s not just fast big data alone; it’s big data in general. Organizations have plenty of big data already in their Enterprise Data Warehouse (EDW) that is used to enrich and provide greater context to fast big data. Some examples of data that drives business decisions include customer information, location and purchase history.

DataTorrent is leading the way in meeting customer requirements in this market by providing extremely scalable ingestion of data from many sources at different rates (“data in motion” and “data at rest”), combined with fault-tolerant, high-performing analytics and flexible Java-based action and alerting, all delivered in an easy-to-use and easy-to-operate product offering, DataTorrent RTS.

The market will continue to evolve toward making analytics easier to use across the enterprise (think non-IT users), cloud-based deployments and even pre-built blueprints for “enterprise configurable” applications.
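The fast-data pattern Fanelli describes — enriching streaming events (“data in motion”) with reference data from an EDW (“data at rest”) and aggregating over windows — can be sketched generically. The Python below is an illustrative sketch only: all names and data are hypothetical, and DataTorrent RTS itself is a Java-based platform with its own operator API.

```python
from collections import defaultdict

# Reference "data at rest" (e.g., pulled from an EDW): customer id -> segment.
# These names and values are invented for illustration.
CUSTOMER_SEGMENTS = {"c1": "gold", "c2": "silver", "c3": "gold"}

def tumbling_window_counts(events, window_size):
    """Enrich each streaming event with at-rest reference data, then
    aggregate event counts per customer segment over fixed-size
    tumbling windows."""
    window = defaultdict(int)
    for i, event in enumerate(events, start=1):
        # Enrichment step: join the in-motion event with at-rest data.
        segment = CUSTOMER_SEGMENTS.get(event["customer_id"], "unknown")
        window[segment] += 1
        if i % window_size == 0:   # window boundary: emit and reset
            yield dict(window)
            window.clear()
    if window:                     # flush a final partial window
        yield dict(window)

stream = [{"customer_id": c} for c in ["c1", "c2", "c1", "c3", "c9", "c2"]]
for result in tumbling_window_counts(stream, window_size=3):
    print(result)
```

The value-decay point from the answer above is why the aggregation emits per window rather than at end-of-stream: downstream alerting can act on each window as it closes instead of waiting for the batch to finish.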

2. Cloud Computing Today: How would you describe the reception of DataTorrent RTS 2.0? What do customers like most about the product?

John Fanelli (DataTorrent): Customer feedback on DataTorrent RTS 2.0 has been phenomenal. There are many aspects of the product that are getting rave reviews. I have to call out that developers have reacted very positively to the Hadoop Distributed Hash Table (HDHT) feature, as it provides them with a distributed, fault-tolerant “application scratchpad” that doesn’t require any external technology or databases. Of course, the marquee features that have the data scientist community abuzz are Project DaVinci (a visual streaming application builder) and Project Michelangelo (a visual data dashboard). Both enable quick experimentation over real-time data and will emerge from Private Beta over the coming months.

3. Cloud Computing Today: How would you describe the differentiation of DataTorrent RTS from Apache Spark and Apache Storm?

John Fanelli (DataTorrent): DataTorrent provides a complete enterprise-grade solution, not just an event-streaming platform. DataTorrent RTS includes an enterprise-grade platform, a broad set of pre-built operators and visual development and visualization tools. Enterprises are looking for what DataTorrent calls a SHARPS platform: an acronym covering Scalability, High Availability, Performance and Security. In each of the SHARPS categories, DataTorrent RTS is superior.

4. Cloud Computing Today: What challenges do you foresee for Big Data achieving mainstream adoption in 2015?

John Fanelli (DataTorrent): Fast big data is gaining momentum! Every day I speak with customers and prospects about their fast big data, the use-case requirements and the projected business impact. The biggest challenge they share with me is that they are looking to move faster than they are able, due to existing projects and the technical skills on their team. DataTorrent RTS’ ease of use and operator libraries support almost any input/output source/sink and provide pre-built analytics modules to address those challenges.

Categories: Big Data, DataTorrent

Google Cloud Monitoring Achieves Beta Status Eight Months After Google’s Stackdriver Acquisition

Last week, Google released the Beta version of the Google Cloud Monitoring platform. Derived from its May 2014 acquisition of Stackdriver, Google Cloud Monitoring enables users to obtain insight into the performance of Google App Engine, Google Compute Engine, Cloud Pub/Sub, and Cloud SQL. As noted in a blog post by Google’s Dan Belcher, Google Cloud Monitoring delivers integrated monitoring of infrastructure, systems, uptime, trend analysis and alerts by way of a SaaS application. In addition, Google Cloud Monitoring enables users to create aggregations of select resources for monitoring and to leverage dashboards that elaborate on latency, capacity, uptime and other performance-related metrics. The platform also enables users to configure alerts that fire when designated metric thresholds are reached, as well as endpoint checks notifying users about the unavailability of APIs, web servers and other “internet-facing resources.” The beta release of Google Cloud Monitoring comes after months of preparation that culminated in the ability of the Stackdriver-based cloud monitoring platform to support the needs of Amazon Web Services and Google Cloud Platform customers alike. The release also follows soon after Google’s announcement of details of Google Cloud Trace, a Beta platform that allows users to analyze remote procedure calls (RPCs) created by a Google App Engine-based application to understand latency distributions between different RPCs and “performance bottlenecks” more generally. The larger significance of the Beta release of Google Cloud Monitoring is that it delivers a tool that can monitor both Google Cloud Platform and Amazon Web Services infrastructures, whereas Amazon’s CloudWatch, for example, is dedicated solely to monitoring the AWS platform.
For now, though, the product underscores Google’s commitment to building its IaaS infrastructure as exemplified by two Beta releases within the space of the early weeks of 2015.
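An endpoint check of the kind described above reduces to a simple loop: probe a URL, measure latency, and classify the result. The sketch below is a generic illustration of that idea, not Google Cloud Monitoring’s actual API; the `fetch` function is injected (here simulated with lambdas) so no real HTTP call is made.

```python
import time

def check_endpoint(url, fetch, timeout=5.0, max_latency=1.0):
    """Probe an internet-facing resource and classify the result.

    `fetch` abstracts the HTTP call (urllib or requests in practice) and
    is expected to return an HTTP status code; injecting it keeps the
    check itself testable without network access.
    """
    start = time.monotonic()
    try:
        status = fetch(url, timeout)
    except Exception:
        # Connection refused, DNS failure, timeout, etc.
        return {"url": url, "up": False, "reason": "unreachable"}
    latency = time.monotonic() - start
    if status >= 500:
        return {"url": url, "up": False, "reason": "server error %d" % status}
    if latency > max_latency:
        return {"url": url, "up": True, "reason": "slow", "latency": latency}
    return {"url": url, "up": True, "reason": "ok", "latency": latency}

# Simulated fetchers stand in for real HTTP calls.
print(check_endpoint("https://example.com/api", lambda u, t: 200))
print(check_endpoint("https://example.com/api", lambda u, t: 503))
```

A monitoring service would run such a probe on a schedule from multiple regions and raise an alert when consecutive checks report the endpoint as down or slow.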

Categories: Google, IaaS

Treasure Data Closes $15M In Series B Funding For Fully Managed, Cloud-Based Big Data Platform

This week, Treasure Data announced the finalization of $15M in Series B funding led by Scale Venture Partners. The funding will be used to accelerate the expansion of Treasure Data’s proprietary, cloud-based platform for acquiring, storing and analyzing massive amounts of data for use cases that span industries such as gaming, the internet of things and digital media. Treasure Data’s Big Data platform specializes in acquiring and processing streaming big data sets that are subsequently stored in its cloud-based infrastructure. Notable about the Treasure Data platform is that it offers customers a fully managed solution for storing streaming big data that can ingest billions of records per day, in a non-HDFS (Hadoop) format. Current customers include Equifax, Pebble, GREE, Wish.com and Pioneer, the last of which leverages the Treasure Data platform for automobile-related telematics use cases. In addition to Scale Venture Partners, all existing board members and their associated funds participated in the Series B capital raise, including Jerry Yang’s AME Venture Fund.

Categories: Big Data, Treasure Data

Neo Technology Raises $20M In Series C Funding For Its Neo4j Graph Database Technology

Neo Technology today announced the finalization of $20M in Series C funding. Today’s round was led by Creandum with additional participation from Dawn Capital. Existing investors Fidelity Growth Partners Europe, Sunstone Capital and Conor Venture Partners all participated in the round. The funding will be used to expand sales operations, enhance product development and build the open source community supporting the Neo4j platform and its attendant partner ecosystem. The funding comes hot on the heels of a year of explosive growth for Neo Technology and its vendor-led open source graph database, Neo4j. Neo Technology’s CEO and co-founder Emil Eifrem remarked on the company’s growth as follows:

There are two strong forces propelling our growth: one is the overall market’s increasing adoption of graph databases in the enterprise. The other is proven market validation of Neo4j to support mission-critical operational applications across a wide range of industries and functions.

Eifrem notes how Neo Technology’s growth has been fueled by increasing enterprise adoption of graph databases in conjunction with Neo4j’s consistent demonstration of its ability to support a variety of production-grade environments. In a phone interview with Cloud Computing Today, Eifrem further remarked that one of the challenges for Neo Technology consists of developing an incisive sales outreach strategy, given that almost every enterprise could benefit from the adoption of graph technologies. Eifrem elaborated that Neo Technology has chosen to prioritize its sales outreach efforts by focusing on use cases that include data-driven recommendations (in e-commerce and social networking, for example), master data management, identity and access management, graph-based search, network and IT operations, the internet of things and pricing, while nevertheless remaining open to other client requests and interests. Since the launch of Neo4j 2.0 last January, Neo4j has seen over 500,000 downloads and boasts thousands of enterprise-grade deployments at organizations such as Walmart, eBay, Earthlink, CenturyLink, Pitney Bowes and Cisco. Based on its impressive record in 2014 and the explosive proliferation of use cases for graph technology, 2015 could well represent an inflection point for Neo Technology as it consolidates its leadership in the graph database space, using its additional funding to gain market traction while continuing to educate the industry on the value proposition of Neo4j.
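The recommendation use case Eifrem mentions boils down to a multi-hop graph traversal: from a person, hop to the things they bought, then to other buyers, then to those buyers’ other purchases. The pure-Python sketch below illustrates the shape of that query on a toy co-purchase graph; in Neo4j the same traversal would be expressed declaratively in Cypher, and all names and data here are invented for illustration.

```python
from collections import Counter

# Toy purchase graph: person -> set of products bought (hypothetical data).
PURCHASES = {
    "alice": {"book", "lamp"},
    "bob":   {"book", "chair"},
    "carol": {"lamp", "desk"},
}

def recommend(person, purchases):
    """Recommend products via a two-hop traversal:
    person -> shared product -> other buyer -> that buyer's other products.
    Products reachable through more co-buyers score higher."""
    mine = purchases[person]
    scores = Counter()
    for other, theirs in purchases.items():
        if other == person or not (mine & theirs):
            continue  # skip self and buyers with no purchase overlap
        for product in theirs - mine:
            scores[product] += 1
    return [product for product, _ in scores.most_common()]

# alice shares "book" with bob and "lamp" with carol,
# so "chair" and "desk" come back as recommendations.
print(recommend("alice", PURCHASES))
```

In a relational store this traversal requires self-joins that grow expensive with each hop; a native graph database keeps each hop a constant-time pointer dereference, which is the performance argument behind the use cases listed above.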

Categories: Neo Technology, Venture Capital

MapR Announces Selection By MediaHub Australia For Digital Archiving And Analytics

MapR recently announced that MediaHub Australia has deployed MapR to support its digital archive that serves 170+ broadcasters in Australia. MediaHub delivers digital content for broadcasters throughout Australia in conjunction with its strategic partner Contexti. Broadcasters provide MediaHub with segments of programs, live feeds and a schedule that outlines when the program in question should be delivered to its audiences. In addition to scheduled broadcasts, MediaHub offers streaming and video on demand services for a variety of devices. MediaHub’s digital archive automates the delivery of playout services for broadcasters and subsequently minimizes the need for manual intervention from archival specialists. MapR currently manages over 1 petabyte of content for the 170+ channels that it serves, although the size of its digital archive is expected to grow dramatically within the next two years. MapR’s Hadoop-based storage platform also provides an infrastructure that enables analytics on content consumption that help broadcasters make data-driven decisions about what content to air in the future and how to most effectively complement existing content. MediaHub’s usage of MapR illustrates a prominent use case for MapR, namely, the use of Hadoop for storing, delivering and running analytics on digital media. According to Simon Scott, Head of Technology at MediaHub, one of the key reasons MediaHub selected MapR as the big data platform for its digital archive concerned its ability to support commodity hardware.

Categories: Big Data, Hadoop, MapR

Notes On Cloud Security: An Interview With Krishna Narayanaswamy, Chief Data Scientist At Netskope

Cloud Computing Today recently spoke to Krishna Narayanaswamy, Chief Data Scientist at Netskope, about the company’s positioning in the cloud security space in addition to his predictions for the cloud security landscape in 2015. Part of the impetus for the conversation was to understand how cloud security involves more than monitoring real-time system and application behavior for signs of fraudulent or abnormal activity, extending to proactive actions based on predictive analytics and machine learning. Krishna responded by elaborating on Netskope’s ability to develop security policies in conjunction with its analytics, as well as on the BYOD phenomenon and its attendant challenges for cloud security. As for cloud security predictions for 2015, Krishna noted an expected increase in data breaches, the use of cloud apps as a vector for the spread of malware and the way in which cloud security for apps sanctioned by enterprise IT policies will need to address the security of data at rest as well as data in motion.

1. Cloud Computing Today: How do you understand Netskope’s differentiation within the cloud security space?

K. Narayanaswamy (Netskope): The workforce of nearly every company today uses cloud apps, and as adoption has become common, it’s more likely that people will share sensitive business information via those apps. Our research shows that the average organization today has 613 cloud apps in use, 88 percent of which are not enterprise ready. Today’s companies not only need a way to discover the cloud apps that are used by their workforce — sanctioned or not — they need to be aware of the activities that happen within those apps, and set policies to prevent the activities that put confidential and sensitive information at risk.

Netskope is the only cloud app security and enablement company that offers real-time analysis and policy creation to prevent unwanted behavior, and the ability to monitor ALL cloud apps (not just those within IT’s purview). Secure cloud enablement is no longer a “one-size-fits-all” solution, and through Netskope’s Active Encryption, any user can tailor the creation of policies to fit their needs; large healthcare companies are going to focus more on HIPAA compliance than a music and entertainment company, for example.

2. Cloud Computing Today: What is the fundamental problem of cloud security as you see it from a business perspective?

K. Narayanaswamy (Netskope): Cloud apps are the norm in the workforce, with the vast majority being brought in by users unbeknownst to IT, a phenomenon known as “shadow IT.” Today, IT grossly underestimates the number of cloud apps in use by their workforce, which presents significant data security and compliance risks. With the BYOD trend gaining momentum with no end in sight, more apps than ever are guaranteed to make their way into organizations and access corporate information, and CISOs are scrambling for a solution. The underlying problem with cloud app security has been that IT has been forced to make a stark black-and-white decision: either block all cloud apps at the network perimeter, or let secure corporate data run rampant in unsanctioned cloud apps. Today, IT can get the insights they need about usage, users and activities within apps so that they can promote secure usage. Rather than clamping down and blocking all apps, IT can embrace BYOD, shed the negative connotation of the term “shadow IT,” and formulate security policies using a highly scalable approach that adapts to the cloud app economy.

3. Cloud Computing Today: What are your Cloud Security Predictions for 2015?

K. Narayanaswamy (Netskope): In 2015 we will see continued growth and adoption of cloud services in enterprises. The adoption will fall under IT sanctioned apps as well as lines of business driven procurement. The implications for security solutions are:

• Enterprise IT sanctioned cloud apps will be deployed in production only in conjunction with a suitable security solution to secure sensitive enterprise data as it migrates to the cloud. These security solutions will cover data at rest as well as data in motion to the cloud apps.
• Cloud security solutions will be deployed to monitor and safely enable the use of non-IT-sanctioned cloud applications.
• We will see increased use of SSO technologies and MFA policies to access cloud applications.
• Data breaches in cloud apps will start becoming prominent. App vulnerabilities will continue to be a major threat vector. Keep an eye out for open source related vulnerabilities: cloud apps rely heavily on open source components, and vulnerabilities in open source packages will imply threats to cloud apps if not addressed in a timely manner. Vulnerable cloud apps in turn lead to data breaches.
• Malware: cloud apps will become a significant channel for the distribution of malware. Existing URL filtering technologies do not adequately address this threat vector.
• Data-driven security will become mainstream. Cloud security solutions will generate metadata that is used for detecting anomalous user behaviors and data theft.
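The last prediction, metadata-driven detection of anomalous user behavior, can be illustrated with a minimal statistical baseline: flag users whose activity deviates sharply from the population mean. The z-score sketch below is a generic illustration using made-up data, not Netskope’s actual detection method.

```python
import statistics

def anomalous_users(activity_counts, threshold=1.5):
    """Flag users whose activity count sits more than `threshold`
    population standard deviations above the mean.

    `activity_counts` maps user -> count of some metadata-derived
    metric (downloads, shares, logins); all names here are invented.
    """
    counts = list(activity_counts.values())
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:
        return []  # identical behavior across users: nothing anomalous
    return [user for user, c in activity_counts.items()
            if (c - mean) / stdev > threshold]

# Hypothetical daily download counts per user; "mallory" is the outlier.
downloads = {"ann": 12, "bo": 9, "cy": 11, "di": 10, "mallory": 480}
print(anomalous_users(downloads))
```

Production systems layer far richer models (per-user baselines, time-of-day patterns, peer-group comparisons) on top of this idea, but the core remains the same: the security product aggregates activity metadata and scores deviations from normal behavior.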

Categories: Netskope
