Q&A With Jon Gacek, CEO Of Quantum, On The Rise Of Corporate Video Usage And The Storage Industry

Cloud Computing Today recently spoke with Jon Gacek, CEO of Quantum, about the demands that the proliferation of video-based data imposes on the storage needs of Quantum’s customers. As noted below, Mr. Gacek elaborated on the ways in which increasing volumes of video-based data create special challenges for customers, driving them to seek storage solutions with the performance, scalability, data retention, automation and indexing capabilities specific to Quantum and its partners.

Cloud Computing Today Question #1: How do you envision the effect of trends in the usage of IP video traffic on the storage industry?

Jon Gacek, (CEO, Quantum) Response #1: The increasing amount of video is creating greater storage demands, including the need to ingest larger files at high speeds, to retain that data cost effectively for long periods of time and to ensure the data can be easily accessed and shared for collaboration, analysis and re-monetization or other reuse. All of this creates new opportunities for the storage industry to help customers.  However, as video transitions from HD to 4K and even higher resolutions, general-purpose storage companies are finding it increasingly difficult to meet the performance demands associated with storing and managing such data.  As a result, customers are turning to storage providers like Quantum that can offer specialized solutions optimized for video workflows.

Cloud Computing Today Question #2: What challenges does the storage industry face in relation to the rise of corporate video?

Jon Gacek, (CEO, Quantum) Response #2: As is the case with video generally, the rise of corporate video creates new data storage and management challenges for customers and opportunities for specialized storage providers to help them meet those challenges. A key element of this is understanding that the customer looking for a corporate video solution often isn’t in IT. For example, it may be a marketing leader trying to leverage video assets across multiple platforms and channels to drive greater brand awareness or promote a new product. They don’t have time to become video storage experts, but they also want to maintain control of their video rather than depend on an already over-burdened IT department. This means they want solutions that are easy to use, include a high level of automation and cost as little as possible. And it’s this cost factor that makes the ability to offer tiered storage solutions such a differentiator in the storage industry. Moving video data off expensive primary storage to lower-cost disk, object storage, tape or cloud tiers can result in significant savings. At the same time, it’s critical that the storage system be able to keep track of where that data resides and provide ready access to it when needed.
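The tiering policy Gacek describes can be sketched as a simple age-based placement rule: assets migrate off expensive primary storage onto cheaper tiers over time, while a catalog tracks where each asset lives so it remains accessible. The following is a minimal illustration only; the tier names, age thresholds and `Asset` structure are hypothetical, not Quantum’s actual implementation:

```python
from dataclasses import dataclass

# Hypothetical tiers, ordered fastest/most expensive first.
# Thresholds are illustrative round numbers, not real policy values.
TIERS = [
    ("primary", 30),          # fast primary storage for the first 30 days
    ("object", 365),          # lower-cost object storage for up to a year
    ("tape_or_cloud", None),  # archive tier thereafter, kept indefinitely
]

@dataclass
class Asset:
    name: str
    age_days: int
    tier: str = "primary"

def place(asset: Asset) -> str:
    """Assign an asset to the first tier whose age limit it still satisfies."""
    for tier, max_age in TIERS:
        if max_age is None or asset.age_days <= max_age:
            asset.tier = tier
            return tier
    return asset.tier

catalog = [Asset("promo_4k.mov", 5),
           Asset("launch_video.mov", 200),
           Asset("old_campaign.mov", 900)]
# The catalog records where each asset resides, preserving ready access.
placement = {a.name: place(a) for a in catalog}
```

The key point of the sketch is the catalog: savings from cheaper tiers only materialize if the system always knows where the data went.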

Cloud Computing Today Question #3: How will the storage industry tackle the problem of indexing unstructured data (such as a notable moment within a soccer video that might be untagged because it is unrelated to a goal, for example)?

Jon Gacek, (CEO, Quantum) Response #3: It’s definitely true that being able to add structure to unstructured data through analytics and indexing enables you to get much greater value out of your data.  In the video space, there are a number of companies that provide such capabilities.  In fact, we have a partner with some great technology that provides high-speed automated video and image search.  You identify the image or kind of shot you’re looking for, and the system can search through massive amounts of data incredibly fast to deliver the results – hours of video can be searched in minutes.  This has a wide range of applications, from the sports video example you referenced to government intelligence and counterterrorism to video surveillance. Video surveillance is also a good example of how technology is enabling benefits beyond those originally envisioned for crime prevention and prosecution. For example, municipalities are now using it to monitor and analyze activities at ports to identify ways to increase logistics efficiencies, and retailers are analyzing the data they get from in-store cameras to understand shopping patterns and how the placement of goods can be optimized to increase sales and bolster customer retention.  However, none of this is possible unless you have the underlying storage infrastructure that can handle the special demands of managing, preserving and delivering video data.

Sponsored Post: The Misadventures of Cloud Computing

The following post represents a republication of this Virtustream blog post with a few minor edits to the text in paragraphs one and two. The post was authored by the Virtustream editorial team.

In collaboration with Virtustream, Cloud Computing Today is excited to debut the first illustration in a five-part cartoon series by Tom Fishburne, “The Marketoonist.” With some techie humor and a touch of irreverence, our “Misadventures of Cloud Computing” series sheds light on the day-to-day challenges facing CIOs and IT leadership teams as they navigate the complex enterprise cloud landscape.

Cloud Computing Today will be unveiling a new Virtustream cartoon every Monday for the next five weeks that puts a comical spin on what really matters when selecting an enterprise cloud solution – security, reliability and performance. We hope you check back in regularly for a midday chuckle and we encourage you to share your perspective and experiences on each cartoon’s theme.

Server Huggers. We all know them – the folks who are hesitant to say goodbye to something they can touch and feel, physical servers, for something distant and intangible. And while it is not actually about “where to put the coffee maker,” cloud reluctance is usually an emotional reaction. Change can be unsettling.
At first blush, it makes sense. Enterprise IT departments manage complex landscapes, and moving complicated, mission-critical legacy apps to the cloud is no small feat. And the thought of experiencing any downtime during the transition is a disconcerting one. Oftentimes the stress and complexity of the transition can be misinterpreted as an aversion to the cloud altogether.

But transitioning your enterprise to the cloud, even in the most complicated instances, can be a smooth, secure ride if you have the right partners on board to lead you through the journey. And while some IT departments feel like the servers they can see and touch are safer or more dependable than ones they can’t, both security and reliability are fundamental to enterprise-grade cloud service providers, which offer continuous enterprise-wide monitoring on a large scale. They have a big stake in ensuring that your data remains safe and that you experience zero downtime during and after the move to the cloud.

While it may be counterintuitive at first, moving to the cloud helps enterprise IT gain control of their systems and data rather than lose it. When less time and money is spent managing hardware and day-to-day upkeep, IT can put more resources into pursuing interesting projects that could make a significant impact on the business.

For more information, check out Virtustream’s LinkedIn page.

Virtustream is the enterprise-class cloud software and service provider trusted by enterprises worldwide to migrate and run their mission-critical applications in the cloud. For enterprises, service providers and government agencies, only Virtustream’s xStream™ cloud management platform (CMP) software and Infrastructure-as-a-Service (IaaS) meet the security, compliance, performance, efficiency and consumption-based billing requirements of complex production applications in the cloud – whether private, public or hybrid.

IBM Acquires Green Hat, Cloud Based Software Testing Company

IBM announced its first acquisition of 2012 on Wednesday by purchasing Green Hat, the software quality and testing company based in Wilmington, Delaware and London, England. Green Hat delivers a cloud based testing environment that enables developers to test software applications without the hassle of setting up the hardware and software required for the testing simulation environment. Green Hat’s cloud testing solutions are particularly useful for rapid application development with ultra-short development timelines, such as smartphone and tablet applications. Green Hat shrinks the percentage of software development costs devoted to testing and simulates a wide variety of IT infrastructures used in development processes. Upon acquisition, Green Hat will join IBM’s Rational Solution for Collaborative Lifecycle Management to enable enterprise customers to optimize their testing processes and accelerate software delivery. The acquisition of Green Hat is expected to improve IBM’s application delivery lifecycle in addition to that of its enterprise customers. Green Hat was founded in 1996 by Peter Cole, the company’s CEO. The majority of its 45 employees are based in London. Terms of the acquisition were not disclosed.

Toyota Plans to Leverage Microsoft Azure for Telematics Services Focused Around Energy Management

Microsoft Corporation and Toyota Motor Corporation’s announcement that the Microsoft Azure cloud computing platform will host telematics applications for Toyota’s electric and plug-in hybrid vehicles marks an important step in the battle for enterprise market share amongst the top cloud computing vendors. Disclosed on April 6, the agreement signifies Microsoft’s increasing dominance in the automotive vertical as it expands its market base beyond Ford, Kia and Fiat. Microsoft and Toyota plan to invest $12 million (1 billion yen) in telematics services for the Toyota subsidiary, Toyota Media Service. Telematics marks the conjunction of information technology with telecommunications in a way that allows users to obtain increased control of energy management, multimedia and location-related services. Expected features of the Microsoft Azure based platform include:

• The ability to determine when to most economically recharge an electric battery in relation to energy costs
• A mobile application that checks battery levels and calculates how far users can drive before recharging their battery
• The ability to manage home energy and air conditioning units from automobiles
• Increased customization over streaming audio and video content
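The first feature above, determining the most economical time to recharge in relation to energy costs, amounts to finding the cheapest contiguous charging window in a forecast of hourly rates. The sketch below is a hypothetical illustration of that idea; the rates, charging duration and function name are made up, and nothing here reflects Toyota’s or Microsoft’s actual implementation:

```python
def cheapest_charge_window(prices_per_hour, hours_needed):
    """Return (start_hour, total_cost) of the cheapest contiguous
    window of `hours_needed` hours in an hourly price forecast."""
    best_start, best_cost = None, float("inf")
    for start in range(len(prices_per_hour) - hours_needed + 1):
        cost = sum(prices_per_hour[start:start + hours_needed])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

# Illustrative hourly rates (cents/kWh) for hours 0-23; a full charge
# is assumed to take 4 hours. Overnight rates are cheapest.
rates = [12, 11, 9, 8, 8, 9, 11, 14, 18, 20, 22, 21,
         20, 19, 18, 17, 16, 17, 19, 21, 18, 15, 13, 12]
start, cost = cheapest_charge_window(rates, 4)
```

With these made-up rates, the cheapest window starts at 2 a.m., when off-peak pricing is in effect.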

Initial deployment of telematics applications is expected amongst Toyota’s electric and plug-in hybrid vehicles in 2012. Toyota plans to deploy a global, Azure based platform to provide advanced telematics services to its customers by 2015. The initial roll-out of Toyota’s partnership with Microsoft will focus on energy management but the platform will more broadly enhance information access and control for its users. Toyota plans to render its telematics platform available to other car manufacturers as well in a move that would increase standardization within the automotive vertical with respect to driver access to energy, entertainment and location information.

Rackspace Targets Startups With its Rackspace Startup Program

Rackspace formally announced a program designed to target start-up companies as customers for its cloud computing products and services on March 11. Titled the “Rackspace Startup Program,” the initiative makes Rackspace’s cloud computing offering available to startups that are part of incubator and accelerator programs such as 500 Startups, TechStars, Y Combinator and General Assembly. Drawing on the fact that Rackspace itself began as a startup, the program offers customized guidance about deploying applications within a cloud computing environment alongside its Rackspace and OpenStack cloud resources. The program offers yet another illustration of divergences between Rackspace’s business model and that of Amazon Web Services. Whereas Amazon Web Services represents a pure product offering, Rackspace provides product enhanced services that complement its cloud offering with consulting services such as those recently formalized by its Cloud Builders service line. Dubset marks an example of a startup that uses Rackspace’s services to stream music and to track and perform analytics on what gets played. In a note on Rackspace’s blog, Dubset reports that their “costs are low and we never have to worry about our Cloud Servers.” Alongside the release of its Startup Program, Rackspace also announced the availability of version 2.0 of its cloud computing application, Rackspace Cloud 2.0, which can additionally be accessed by iPhone, iPad and iPod Touch. The free iPhone and iPod Touch application enables cloud managers and development teams to access and transform their cloud computing environments while away from their desktop or laptop consoles.

The Case for Cloud Computing in Large Scale Enterprises

With IT budgets increasingly stretched thin, CIOs in large scale enterprises are considering cloud computing because it offers an alternative to data centers containing expensive servers that need to be constantly maintained by expensive technicians and periodically replaced. Alongside IT hardware, electricity constitutes the other notable expense associated with data centers and represents a commodity that, like capital investments in equipment, stands to be decreased as a result of a transition to a cloud computing model of software delivery and development. In a cloud computing environment, subscribers pay only for usage of instances of a server and hence avoid both the capital investment in servers and the constant drain of power and ventilation required to maintain data centers.
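The cost argument above can be made concrete with some back-of-the-envelope arithmetic: owned servers incur amortized capital expense plus recurring power, cooling and administration, while pay-per-use cloud pricing charges only for instance hours consumed. All figures in this sketch are hypothetical round numbers chosen for illustration, not vendor pricing:

```python
# Illustrative comparison of owned-server vs pay-per-use cloud costs.
# Every dollar figure below is a made-up assumption, not real pricing.

def datacenter_annual_cost(servers, capex_per_server=5000, lifespan_years=4,
                           power_cooling_per_server=800, admin_per_server=1200):
    """Capex is amortized over the server lifespan; power, cooling
    and admin labor recur every year regardless of utilization."""
    return servers * (capex_per_server / lifespan_years
                      + power_cooling_per_server + admin_per_server)

def cloud_annual_cost(instance_hours, rate_per_hour=0.50):
    """Subscribers pay only for the instance hours they actually use."""
    return instance_hours * rate_per_hour

onprem = datacenter_annual_cost(servers=20)
# A workload that only needs ~10 instances for 8 busy hours a day:
cloud = cloud_annual_cost(instance_hours=10 * 8 * 365)
```

Under these assumptions the bursty workload costs a fraction of the always-on data center, which is precisely the avoided capital investment and power drain the paragraph describes; a workload running flat-out around the clock would narrow or reverse the gap.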

Large enterprises additionally face the challenge of maintaining chaotic assemblages of hardware that have been acquired over generations and vary considerably depending on the needs of each business unit. Upgrading fleets of servers and machines represents a significant logistical and technical challenge that requires standardizing security applications over a wide range of machines and operating systems. A cloud computing model enables large enterprises to render their security policy uniform and scalable as their hardware needs change over time.

Cloud computing also offers opportunities for collaboration and innovation amongst geographically dispersed business units that confront technological challenges in sharing their work and methodologies. By using a cloud computing model that provides an interface to a set of shared resources, geographically disparate employees have the opportunity to share information and innovate through a common platform with standardized tools. Moreover, a transition to cloud computing frees up IT staff to drive innovation within their business units instead of dedicating resources to maintaining data center resources whose capacity has been increasingly utilized given the proliferation of automated, computationally intensive or web-based technologies. IT executives now have the freedom to collaborate closely with strategic business leadership to determine how best to utilize technology to foster innovation, shorten delivery cycles and decrease costs.

Factors that large enterprises need to consider when transitioning to cloud computing include the following:

• Cost savings (a demonstrated return on investment due to decreased costs for investment in hardware, space, ventilation and electricity)
• Visibility into the performance and processes of virtual machines (VMs), enabled by virtualization technology that hosts more than one image or instance of a machine on a single physical machine
• Control over deployment of the cloud and the ability to edit settings, provision or cancel machines at will and deploy additional technologies
• Security considerations and the opportunity to standardize security software across a diverse range of machines housed in different business units, enabling a more robust, enterprise wide security policy
• Accounting concerns related to integration of cloud computing costs with the enterprise’s accounting processes

Additionally, large enterprises considering a transition to cloud computing should weigh the opportunity to foster increased organizational collaboration and innovation, since a more easily accessible IT infrastructure makes it simpler to exchange ideas across different business units and geographically dispersed offices.

Despite the benefits of transitioning to cloud computing for a large scale enterprise, executing the transition from a data center environment to a cloud based approach requires significant planning and strategic leadership. Security and system uptime remain the two issues most frequently cited by CIOs as deterrents to executing a cloud based IT strategy. For large scale enterprises, security and system uptime concerns can be most effectively managed by a robust governance process that takes responsibility for service level agreements, security, regulatory compliance and ensuring that the enterprise’s IT strategy is synchronized with the company’s larger business strategy. Robust governance processes need to be supplemented by a cloud strategy that spells out how people, processes and technology will enable the transition to a cloud based model of computing for the organization.

Company Profile: Joyent and Application Virtualization

Joyent is a cloud computing vendor based in San Francisco. Founded by David Young (CEO) and David Hoffman (Chief Scientist) in 2004, Joyent is an Infrastructure as a Service vendor whose business model targets large scale enterprises, particularly in the online gaming space. Joyent has created its own technology stack called SmartDataCenter that it either licenses to third party customers or uses to deliver cloud computing services directly to customers such as LinkedIn, Kabam and the Gilt Groupe. Unlike other cloud computing vendors, Joyent takes virtualization a step further than hardware virtualization by virtualizing its cloud computing operating system over a pool of hardware resources, guaranteeing applications access to those resources. Because Joyent’s cloud computing operating system is virtualized, every application that operates on its SmartOS platform has access to its entire fleet of servers, with the result that customers need not create procedures for provisioning additional server resources as necessary. Applications on Joyent’s SmartOS platform are de facto deterritorialized across Joyent’s collective pool of hardware resources.

Joyent’s website describes its application virtualization as follows:

The SmartMachine has been designed to be very transparent to the underlying operating system, Joyent SmartOS. SmartOS uses this visibility into the SmartMachine to provide all SmartMachines with as-needed access to a pool of all available resources on a single piece of hardware while still providing each SmartMachine with minimum guaranteed access to resources based on a pre-established fair share schedule. This transparency also allows the underlying operating system, Joyent SmartOS, to identify underutilized resources and use them to provide enhanced application performance management. In normal operating conditions, all RAM and CPU resources are either directly used by applications, or are being used by the operating system to optimize disk I/O and provide other performance enhancements to the SmartMachines.
Source: “Joyent: Application Virtualization Hosting”

Joyent’s claim here is that the virtualization of the SmartOS operating system on which all of its applications run enables it to maximize productivity by identifying “underutilized resources” that can in turn be deployed to enhance “application performance management.” A series of third-party benchmarking tests by the IMS Company claimed that Joyent SmartMachines, Windows Virtual Machines and Linux Virtual Machines outperformed their Amazon EC2 counterparts. Specifically, the IMS Company claimed that the Joyent SmartMachine’s disk I/O, Linux Virtual Machine CPU and Windows Virtual Machine disk I/O were faster than the corresponding Amazon EC2 machines by factors of 14, 5 and 2 respectively. Everyone in the cloud computing space knows that benchmarking tests are notoriously difficult to appraise, but Joyent’s willingness to position itself directly against Amazon EC2 in both press releases and company webinars speaks volumes about its confidence in its ability to execute.
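The “fair share schedule” in the quoted passage describes a general scheduling technique: every tenant is guaranteed a minimum slice of a shared resource pool, and capacity an idle tenant leaves unused is redistributed to tenants that want more. The toy allocator below illustrates that general idea only; it is a simplified sketch under assumed inputs, not Joyent’s actual SmartOS scheduler:

```python
def fair_share(total, guarantees, demands):
    """Allocate `total` units of a resource: each tenant first receives
    its guaranteed share (capped by its demand), then leftover capacity
    is redistributed, weighted by guarantee, to still-unsatisfied tenants."""
    alloc = {t: min(guarantees[t], demands[t]) for t in guarantees}
    leftover = total - sum(alloc.values())
    hungry = {t for t in demands if demands[t] > alloc[t]}
    while leftover > 1e-9 and hungry:
        weight = sum(guarantees[t] for t in hungry)
        grants = {t: min(demands[t] - alloc[t],
                         leftover * guarantees[t] / weight)
                  for t in hungry}
        for t, g in grants.items():
            alloc[t] += g
        leftover -= sum(grants.values())
        hungry = {t for t in hungry if demands[t] > alloc[t] + 1e-9}
    return alloc

# 100 units of CPU, two tenants each guaranteed 50. Tenant "a" is mostly
# idle, so tenant "b" can burst past its guarantee using idle capacity.
alloc = fair_share(100, {"a": 50, "b": 50}, {"a": 10, "b": 90})
```

This captures the claim in the quote: minimum guarantees are always honored, while otherwise-idle resources are put to work boosting busy applications.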

Joyent’s CTO Mark Mayo attributes its performance to its application virtualization design: “Most people have resigned themselves to painfully slow disk I/O in the cloud,” Mayo noted. “But these results demonstrate that they don’t have to settle for mediocrity. Joyent’s cloud architecture uses lightweight virtualization that doesn’t impose overhead on I/O, so SmartMachines are as much as 14 times faster than Amazon’s EC2 machines.”

Performance marks one of many factors to consider when choosing a cloud computing vendor, but the IMS Company’s results nevertheless raise the question of whether Amazon’s position as market share leader has compromised the performance and speed of its EC2 product offering. Conversely, cloud customers will need to consider whether Joyent has the capacity to accommodate a growing number of enterprise customers likely to strain an infrastructure that already supports computationally intensive applications from customers such as Kabam, Social Gaming Universe, Neverbug Entertainment and ZooLife.