Big Data

Datapipe’s Acquisition Of GoGrid Underscores The Industry Trend Of The Intersection Of Cloud And Big Data

Managed hybrid cloud IT solution provider Datapipe recently announced its acquisition of GoGrid, a leader in operationalizing Big Data for cloud deployments. While GoGrid boasts over a decade of experience in managed and dedicated cloud hosting, the company recently added a slew of Big Data offerings to its product line, including NoSQL database products, a 1-button deployment process and a partnership with Cloudera to accelerate enterprise Hadoop deployments. Robb Allen, CEO of Datapipe, commented on the significance of the acquisition as follows:

GoGrid has made it easy for companies to stand up Big Data solutions quickly. Datapipe customers will achieve significant value from the speed at which we can now create new Big Data projects in the cloud. This acquisition advances Datapipe’s strategy to help our enterprise clients architect, deploy and manage multi-cloud hybrid IT solutions.

Here, Allen remarks on the way in which GoGrid’s success in streamlining the implementation of Big Data solutions enhances Datapipe’s ability to offer enterprise customers Big Data solutions alongside managed cloud hosting. Given that cloud adoption has significantly outpaced Big Data adoption in the enterprise to date, Datapipe stands poised to consolidate its leadership among cloud vendors offering Big Data solutions to enterprise customers. By acquiring GoGrid, Datapipe positions itself to offer customers the near-limitless scalability of the cloud in addition to the infrastructure to store petabytes of data. Cloud-based Big Data solutions enable customers to run analytics in parallel on transactional and non-transactional datasets alike, deriving insights that draw upon the union of financial, operational, marketing, sales and third-party data. As a result, Datapipe’s acquisition of GoGrid cements its already strong market positioning in the nascent but rapidly expanding space at the intersection of cloud computing and Big Data.

Categories: Big Data, Datapipe, GoGrid

Conversation With John Fanelli, DataTorrent’s VP of Marketing, Regarding Analytics On Streaming Big Data

Cloud Computing Today recently spoke to John Fanelli, DataTorrent’s VP of Marketing, about Big Data, real-time analytics on Hadoop, DataTorrent RTS 2.0 and the challenges specific to performing analytics on streaming Big Data sets. Fanelli commented on the market reception of DataTorrent’s flagship product DataTorrent RTS 2.0 and the mainstream adoption of Big Data technologies.

1. Cloud Computing Today: Tell us about the market landscape for real-time analytics on streaming Big Data and describe DataTorrent’s positioning within that landscape. How do you see the market for real-time analytics evolving?

John Fanelli (DataTorrent): Data is being generated today in not only unprecedented volume and variety, but also velocity. Human-created data is being surpassed by automatically generated data (sensor data, mobile devices and transaction data, for example) at a very rapid pace. The term we use for this is fast big data. Fast big data can provide companies with valuable business insights, but only if they act on them immediately. If they don’t, the business value declines as the data ages.

As a result of this business opportunity, streaming analytics is rapidly becoming the norm as enterprises rush to deliver differentiated offerings to generate revenue or create operational automated efficiencies to save cost. But it’s not just fast big data alone; it’s big data in general. Organizations have plenty of big data already in their Enterprise Data Warehouse (EDW) that is used to enrich and provide greater context to fast big data. Some examples of data that drives business decisions include customer information, location and purchase history.

DataTorrent is leading the way in meeting customer requirements in this market by providing extremely scalable ingestion of data from many sources at different rates (“data in motion” and “data at rest”), combined with fault-tolerant, high-performance analytics and flexible Java-based action and alerting, delivered in an easy-to-use and easy-to-operate product offering, DataTorrent RTS.

The market will continue to evolve toward making analytics easier to use across the enterprise (think non-IT users), cloud-based deployments and even pre-built blueprints for “enterprise configurable” applications.
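The declining value of aging data that Fanelli describes is the reason streaming systems typically compute over sliding windows, expiring events once they fall outside the window of interest. The minimal Python sketch below is a generic illustration of that pattern (not DataTorrent's API): per-key counts always reflect only the most recent window of events.

```python
from collections import deque


class SlidingWindowCounter:
    """Counts events per key over a rolling time window.

    Stale events are expired as new ones arrive, so queries only
    ever reflect recent ("fast") data.
    """

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, key), in arrival order
        self.counts = {}       # key -> count of events still in window

    def add(self, key, ts):
        """Record an event for `key` observed at time `ts` (seconds)."""
        self.events.append((ts, key))
        self.counts[key] = self.counts.get(key, 0) + 1
        self._expire(ts)

    def _expire(self, now):
        # Drop events older than the window, oldest first.
        while self.events and now - self.events[0][0] > self.window:
            _, key = self.events.popleft()
            self.counts[key] -= 1
            if self.counts[key] == 0:
                del self.counts[key]

    def count(self, key):
        """Events seen for `key` within the last `window_seconds`."""
        return self.counts.get(key, 0)
```

With a 60-second window, an event added at timestamp 70 silently expires an event recorded at timestamp 0, so the count for that key drops: the business value of the old observation has, in Fanelli's terms, aged out.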

2. Cloud Computing Today: How would you describe the reception of DataTorrent RTS 2.0? What do customers like most about the product?

John Fanelli (DataTorrent): Customer feedback on DataTorrent RTS 2.0 has been phenomenal. There are many aspects of the product that are getting rave reviews. I have to call out that developers have reacted very positively to the Hadoop Distributed Hash Table (HDHT) feature, as it provides them with a distributed, fault-tolerant “application scratchpad” that doesn’t require any external technology or databases. Of course, the marquee features that have the data scientist community abuzz are Project DaVinci (visual streaming application builder) and Project Michelangelo (visual data dashboard). Both enable quick experimentation over real-time data and will emerge from Private Beta over the coming months.

3. Cloud Computing Today: How would you describe the differentiation of DataTorrent RTS from Apache Spark and Apache Storm?

John Fanelli (DataTorrent): DataTorrent provides a complete enterprise-grade solution, not just an event-streaming platform. DataTorrent RTS includes an enterprise-grade platform, a broad set of pre-built operators and visual development and visualization tools. Enterprises are looking for what DataTorrent calls a SHARPS platform. SHARPS is an acronym for Scalability, High Availability, Performance and Security. In each of the SHARPS categories, DataTorrent RTS is superior.

4. Cloud Computing Today: What challenges do you foresee for Big Data achieving mainstream adoption in 2015?

John Fanelli (DataTorrent): Fast big data is gaining momentum! Every day I speak with customers and prospects about their fast big data, the use-case requirements and the projected business impact. The biggest challenge they share with me is that they are looking to move faster than they are able due to existing projects and the technical skills on their teams. DataTorrent RTS’ ease of use and operator libraries support almost any input/output source and sink and provide pre-built analytics modules to address those challenges.

Categories: Big Data, DataTorrent

Treasure Data Closes $15M In Series B Funding For Fully Managed, Cloud-Based Big Data Platform

This week, Treasure Data announced the finalization of $15M in Series B funding led by Scale Venture Partners. The funding will be used to accelerate the expansion of Treasure Data’s proprietary, cloud-based platform for acquiring, storing and analyzing massive amounts of data for use cases that span industries such as gaming, the internet of things and digital media. Treasure Data’s Big Data platform specializes in acquiring and processing streaming big data sets that are subsequently stored in its cloud-based infrastructure. Notably, the Treasure Data platform offers customers a fully managed solution for storing streaming big data that can ingest billions of records per day in a non-HDFS (Hadoop) format. Current customers include Equifax, Pebble, GREE, Wish.com and Pioneer, the last of which leverages the Treasure Data platform for automobile-related telematics use cases. In addition to Scale Venture Partners, all existing board members and their associated funds participated in the Series B capital raise, including Jerry Yang’s AME Venture Fund.

Categories: Big Data, Treasure Data

MapR Announces Selection By MediaHub Australia For Digital Archiving And Analytics

MapR recently announced that MediaHub Australia has deployed MapR to support its digital archive, which serves 170+ broadcasters in Australia. MediaHub delivers digital content for broadcasters throughout Australia in conjunction with its strategic partner Contexti. Broadcasters provide MediaHub with segments of programs, live feeds and a schedule that outlines when the program in question should be delivered to its audiences. In addition to scheduled broadcasts, MediaHub offers streaming and video-on-demand services for a variety of devices. MediaHub’s digital archive automates the delivery of playout services for broadcasters and thereby minimizes the need for manual intervention from archival specialists. MapR currently manages over 1 petabyte of content for the 170+ channels that MediaHub serves, and the size of the digital archive is expected to grow dramatically within the next two years. MapR’s Hadoop-based storage platform also provides an infrastructure for analytics on content consumption that helps broadcasters make data-driven decisions about what content to air in the future and how to most effectively complement existing content. MediaHub’s usage of MapR illustrates a prominent use case for the platform, namely, the use of Hadoop for storing, delivering and running analytics on digital media. According to Simon Scott, Head of Technology at MediaHub, one of the key reasons MediaHub selected MapR as the Big Data platform for its digital archive was its ability to run on commodity hardware.

Categories: Big Data, Hadoop, MapR

Basho Technologies Raises $25M In Series G Funding And Announces Record Growth In 2014

Basho Technologies, creator of the Riak NoSQL key-value database platform, today announced the finalization of $25M in Series G funding led by existing investor Georgetown Partners. In addition to the funding news, Basho revealed details of record growth, including sequential growth of 62 percent and 116 percent in Q3 and Q4 of 2014, respectively. 2014 represented a landmark year for Basho given that it shipped Riak 2.0 and Riak CS 1.5 and appointed Adam Wray, former CEO of Tier 3, as CEO. In the same year, Basho replaced Oracle as the database platform for the UK’s National Health Service and deepened its relationship with The Weather Company, as noted below by Bryson Koehler, executive vice president and CITO for The Weather Company:

The amount of data we collect from satellites, radars, forecast models, users and weather stations worldwide is over 20TB each day and growing quickly. This data helps us deliver the world’s most accurate weather forecast as well as deliver more severe weather alerts than anyone else, so it is absolutely mission critical and has to be available all of the time. Riak Enterprise gives us the flexibility and reliability that we depend on to enable over 100,000 transactions a second with sub 20ms latency on a global basis.

Here, Koehler remarks on the ability of Riak Enterprise to handle “over 100,000 transactions a second” with latencies under 20 ms. Importantly, The Weather Company’s data collection rate of 20 TB a day illustrates the massive volumes of data that Riak Enterprise can aggregate for archival and analytic use cases. As told to Cloud Computing Today in an interview with Basho CEO Adam Wray, Riak also gained traction in verticals such as gaming, healthcare and financial services in 2014, with much of its uptake propelled by industry trends marked by increased adoption of Big Data, distributed systems and applications in the cloud, and the growth of the internet of things. Wray further remarked that Riak stands strongly positioned to reap the benefits of increased stakeholder awareness of the value of key-value stores and concepts such as eventual consistency. Today’s capital raise brings Basho’s total funding to $65M. With an extra $25M in the bank and an enviable roster of enterprise customers out of the gate, the NoSQL space should expect Basho to build steadily upon its success in 2014, gaining even more traction amongst Fortune 50 customers and staking out its position amongst the likes of MongoDB, MarkLogic, Couchbase and DataStax, with a particular focus on sharpening its differentiation from other key-value store databases such as Couchbase and DataStax.
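The eventual consistency Wray mentions is the reconciliation model popularized by key-value stores like Riak: replicas may briefly disagree after concurrent writes, but converge once they exchange state. The toy Python sketch below illustrates the concept only (it is not Basho's implementation) and uses simple last-write-wins reconciliation; production systems such as Riak use richer mechanisms like vector clocks to detect conflicting writes.

```python
class Replica:
    """One node of a toy replicated key-value store.

    Each write carries a logical timestamp; a replica keeps the
    highest-timestamped value it has seen for a key (last-write-wins).
    """

    def __init__(self):
        self.data = {}  # key -> (timestamp, value)

    def put(self, key, value, ts):
        current = self.data.get(key)
        if current is None or ts > current[0]:
            self.data[key] = (ts, value)

    def get(self, key):
        entry = self.data.get(key)
        return entry[1] if entry else None


def anti_entropy(a, b):
    """Merge two replicas so both converge on the newest value per key."""
    for key in set(a.data) | set(b.data):
        entries = [e for e in (a.data.get(key), b.data.get(key)) if e]
        newest = max(entries, key=lambda e: e[0])
        a.data[key] = newest
        b.data[key] = newest
```

After divergent writes, each replica answers reads from its own local state; running `anti_entropy` brings both to the newest value per key. That convergence-after-exchange behavior is the "eventual" in eventual consistency.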

Categories: Basho Technologies, Big Data, NoSQL

Internet Of Things Predictions For 2015 From ParStream

The following are Internet of Things predictions from ParStream, the company behind an analytics platform built for the internet of things.

1. The Rise of the Chief-IoT-Officer: In the not too distant past, there was an emerging technology trend called “eBusiness”. Many CEOs wanted to accelerate the adoption of eBusiness across various corporate functions, so they appointed a change leader, often known as the “VP of eBusiness,” who partnered with functional leaders to help propagate and integrate eBusiness processes and technologies within legacy operations. IoT represents a similar transformational opportunity. As CEOs start examining the implications of IoT for their business strategy, there will be a push to drive change and move forward faster. A new leader, called the Chief IoT Officer, will emerge as an internal champion to help corporate functions identify the possibilities and accelerate adoption of IoT on a wider scale.

2. Analytics Will Be the #1 Priority for IoT Initiatives: 2014 was about sensors and devices. The initial objective of many IoT projects was simply placing sensors on critical assets such as aircraft engines, cell phone towers, cargo containers, and more to start collecting data from real-time events. Early IoT pilots demonstrated the wealth of information made possible by sensors and connections. 2015 will be about value. The attention will quickly shift from simply “enabling IoT” to truly “generating benefits from IoT”. Timely analytics is key to gaining actionable insights from data and hence a prerequisite for realizing the full potential of IoT. To drive more business value from IoT, companies will analyze more real-time data and implement new, innovative ways of delivering analytics to the “edge,” or source, of data.

3. IoT Platform to IoT Platform Integration Will Drive Relevance: Forrester recently proclaimed “IoT software platforms will become the rage in 2015”. Indeed, many IoT software companies are thinking “platform” rather than just “modules” to help deliver something closer to a “whole offer” for customers. However, an IoT platform’s real value will be driven by its integration with other IoT platforms. The reality is that there is no single, end-to-end IoT platform that can deliver device management, data aggregation, analytics, visualization, etc. for the breadth of potential IoT use cases. Hence, the power and value proposition of an IoT platform will be driven by its connection and integration with other complementary IoT platforms.

4. Industrial/Enterprise IoT Will Take Center Stage in the Media Spotlight: Driven by well-publicized acquisitions (e.g. Google/Nest) and high-profile new products (e.g. Fitbit, Apple Watch, etc.), consumer IoT has received a disproportionate amount of media attention compared to industrial IoT. While consumer IoT will eventually be a huge market, the hype greatly outweighs the near-term reality with respect to adoption. However, the tide is turning and industrial IoT will take the spotlight in 2015 as the media starts to more frequently cover the massive opportunity and traction of enterprise IoT in driving efficiency and creating new business models (e.g. Harvard Business Review’s cover story on IoT in their November 2014 issue).

Categories: Big Data, ParStream

Hortonworks and New Relic Post Impressive Gains In First Day Of IPO Trading

On Friday December 12, Hortonworks finished its first day of trading with a share price of $26.48, roughly 65% above its IPO price of $16 per share. Hortonworks raised $100M through the sale of 6,250,000 publicly available shares. Friday’s impressive showing bodes well for the Hadoop infrastructure and analytics market in 2015, particularly given that Hortonworks competitors are gearing up to execute IPOs in 2015 or shortly thereafter. Cloud monitoring and analytics vendor New Relic similarly gained in its first day of trading, rising roughly 44% from its IPO price of $23 per share to $33.02 by the end of the day. The results represented a huge coup for venture capitalist Peter Fenton of Benchmark Capital, who serves on the board of directors of both companies. Whereas Hortonworks raised $100M in its IPO, New Relic raised $115M. The real winner in both IPOs, however, is Yahoo, given that Yahoo owns roughly 20% of the shares of its spin-off Hortonworks and 16.8% of the shares of New Relic.

Categories: Big Data, Hortonworks, New Relic, Venture Capital, Yahoo
