Amazon has announced Snowball Edge, an on-premises appliance that supports Amazon EC2 (Elastic Compute Cloud), AWS Lambda (“serverless” computing) and S3 (Simple Storage Service), all running locally.
Sounds like Microsoft’s Azure Stack? A bit, but the AWS appliance is tiny by comparison and therefore more limited in scope. Nevertheless, it is a big turnaround for a company that has previously insisted everything belongs in the cloud. One of the Snowball Edge case studies is in the same general area as one used by Microsoft for Azure Stack: ships.
The specifications are shy about revealing what is inside, but there is 100TB of storage (82TB usable); 10Gb, 20Gb and 40Gb network connections (10GBase-T, SFP+ and QSFP+); dimensions of 259 x 671 x 386mm (pretty small); and power consumption of 400 watts.
Jeff Barr’s official blog post adds that each appliance has an “Intel Xeon D processor running at 1.8 GHz, and supports any combination of instances that consume up to 24 vCPUs and 32 GiB of memory.”
You can cluster Snowball Edge appliances, though, so substantial systems are possible.
Operating systems currently supported are Ubuntu Server and CentOS 7.
Amazon’s approach is to extend its cloud to the edge rather than vice versa. You prepare your AMIs (Amazon Machine Images) in the cloud before the appliance is shipped. The very fast networking support shows that the intent is to maintain the best possible connectivity, even though in some of the target scenarios internet connectivity will inevitably be poor.
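To give a flavour of that cloud-first workflow, here is a minimal sketch of ordering a Snowball Edge with pre-loaded AMIs using boto3’s Snowball API; the AMI ID, bucket, address ID and role ARN are placeholders rather than real resources, and the exact parameters for your job may differ.

```python
# Sketch only: request a Snowball Edge job with an S3 bucket and an
# EC2-compatible AMI prepared in the cloud, so both are available locally
# on the appliance when it arrives. All identifiers below are placeholders.
import boto3

snowball = boto3.client("snowball", region_name="us-east-1")

response = snowball.create_job(
    JobType="LOCAL_USE",          # local compute/storage rather than bulk data transfer
    SnowballType="EDGE",          # request a Snowball Edge appliance
    Description="Edge compute for ship-board workloads",
    AddressId="ADID-example",     # shipping address previously registered with create_address()
    RoleARN="arn:aws:iam::123456789012:role/snowball-example-role",
    Resources={
        # Bucket whose contents are exposed on the appliance's local S3 endpoint
        "S3Resources": [{"BucketArn": "arn:aws:s3:::example-edge-bucket"}],
        # AMI built in the cloud beforehand, to run as a local EC2 instance on the device
        "Ec2AmiResources": [{"AmiId": "ami-0123456789abcdef0"}],
    },
    ShippingOption="STANDARD",
)

print("Snowball Edge job created:", response["JobId"])
```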
A point to note is that while the documentation emphasises use cases where there are technical advantages to on-premises (or edge) computing, Barr instead quotes a customer who wanted easier management. A side effect of the cloud computing revolution is that provisioning and managing cloud infrastructure is easier than with systems designed for on-premises infrastructure, such as Microsoft’s System Center; otherwise cloud services would not be viable. Having tasted what is possible in the cloud, customers want the same on-premises.