AWS Lambda Delivers Cloud-Based, Zero-Administration, Event-Driven Computing

Amazon Web Services recently rolled out AWS Lambda, a service that continues Amazon's reputation for disrupting contemporary cloud computing. AWS Lambda frees developers from the need to create persistent applications that reside on virtual machines or servers. Instead, developers write Lambda functions: code, written in Node.js, that executes in response to events such as changes to objects in Amazon S3, data feeds from Amazon Kinesis and updates to tables in Amazon DynamoDB. Developers grant Lambda functions permission to access specific AWS resources, enabling the functions to activate select AWS infrastructure components as needed to perform their application logic. Part of the appeal of Lambda is that it spins up infrastructure components on demand in response to incoming events, and shuts them down when they are idle, thereby conserving resources and minimizing costs.

An example of Lambda’s capabilities is the automated creation of thumbnails in response to images uploaded to S3. Developers no longer need to worry about managing EC2 instances or installing database, application and orchestration frameworks: computing is managed by AWS Lambda and its zero-administration platform. As such, AWS Lambda brings the power of cloud-based, event-driven computing to Amazon Web Services and continues Amazon’s tradition of simplifying developer involvement in the provisioning, configuration and management of IT infrastructure. Lambda’s backend allows developers to focus on application development rather than on managing fleets of EC2 instances, empowering them to create event-driven applications in which code executes in response to predefined triggers in incoming data streams. The service is currently available in preview mode, with limits on the number of “concurrent function requests” and language support restricted to JavaScript running on Node.js.


Microsoft Announces Updates to Windows Azure

Microsoft today announced a set of updates to Windows Azure, its Platform as a Service (PaaS) cloud computing offering that was launched in February 2010. Microsoft grouped the updates into the categories of Ease of Use, Interoperability and Overall Value. Here are some highlights from the wide-ranging update:

• Node.js language libraries added to the Windows Azure software development kit (SDK)

Windows Azure now supports the Node.js software system, known for its suitability for highly scalable internet applications such as web servers. The Azure software development kit for Node.js includes libraries for blob, table and queue storage as well as PowerShell command-line tools for development.

• Apache Hadoop on Windows Azure – A Preview

Developers seeking to unlock the Big Data potential of Apache Hadoop can now obtain a preview of Hadoop on Windows Azure by taking advantage of a streamlined installation process that sets up Hadoop on Azure in hours as opposed to days.

• Upper Bound on Price for Large Azure Databases

The maximum Azure database size has been increased from 50 GB to 150 GB. Additionally, the maximum price for the 150 GB database has been capped at $499.95, resulting in a 67% decrease in price per gigabyte for customers using the largest size.
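The 67% figure follows from the per-gigabyte cost at the cap, assuming (as the announced percentage implies) that the previous 50 GB maximum carried the same $499.95 price:

```javascript
// Price-per-GB comparison at the maximum database size.
var maxPrice = 499.95;          // cap for the largest database
var oldPerGb = maxPrice / 50;   // previous 50 GB maximum  -> ~$10.00/GB
var newPerGb = maxPrice / 150;  // new 150 GB maximum      -> ~$3.33/GB
var decrease = Math.round((1 - newPerGb / oldPerGb) * 100);
console.log(decrease + '% per-GB price decrease'); // 67%
```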

• Lowering of Data Transfer Prices

Data transfer fees in Zone 1 (North America and Europe) have been lowered from $0.15/GB to $0.12/GB. Data transfer fees in Zone 2 (Asia Pacific) have been lowered from $0.20/GB to $0.19/GB.
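In percentage terms, the quoted per-gigabyte rates work out to a larger cut in Zone 1 than in Zone 2:

```javascript
// Percentage reduction in per-GB data transfer fees, by zone.
function pctDrop(oldPrice, newPrice) {
  return Math.round((oldPrice - newPrice) / oldPrice * 100);
}
console.log('Zone 1: ' + pctDrop(0.15, 0.12) + '%'); // 20%
console.log('Zone 2: ' + pctDrop(0.20, 0.19) + '%'); // 5%
```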

Additional updates include enhanced tools for Eclipse/Java, MongoDB, SQL Azure Federation, Solr/Lucene and Memcached, as well as access to Windows Azure libraries for .NET, Java and Node.js on GitHub under the Apache 2 license.