The new A1 EC2 Instance

At re:Invent 2018, AWS introduced A1 instances. Compared to other instance types, they cost 40% less 'per core'. Because they use the Nitro hypervisor, they also offer better performance than traditional Xen-hypervisor-based instances.

However, before you move all your workloads over, it is worth knowing when these instances are a good fit and how much effort is required to migrate to them.


A1 instances use ARM-based processors. If your workloads compile to native code for the x86 architecture, you will need to recompile them for ARM before they can run on these instances.
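A quick sanity check before migrating is to confirm which architecture a build host or instance actually reports. A minimal sketch (the mapping below is illustrative; on Linux, A1 instances report `aarch64`):

```python
import platform

def migration_note(machine: str) -> str:
    """Map a reported machine architecture to a short migration note."""
    if machine in ("x86_64", "amd64"):
        return "native binaries must be recompiled for ARM (aarch64)"
    if machine in ("aarch64", "arm64"):
        return "already on a 64-bit ARM platform"
    return "unrecognized architecture: " + machine

# platform.machine() reports the current host's architecture,
# e.g. 'x86_64' on Intel/AMD instances or 'aarch64' on A1 instances.
print(platform.machine(), "->", migration_note(platform.machine()))
```

Running this on both your current fleet and an A1 test instance makes it obvious which build artifacts need a second, ARM-targeted compile.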

For interpreted languages, the migration effort should be negligible, as long as there is no native module somewhere in the dependency chain.
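One way to gauge that risk in Python is to check whether a dependency is a compiled C extension (which would need an ARM build) rather than pure Python. A hedged sketch, using only the standard library:

```python
import importlib
import importlib.machinery

def is_native_extension(module_name: str) -> bool:
    """Return True if the imported module is a compiled extension
    (.so/.pyd) that would need rebuilding for ARM, rather than pure Python."""
    module = importlib.import_module(module_name)
    filename = getattr(module, "__file__", None)
    if filename is None:
        # Built-in modules are compiled into the interpreter itself,
        # so they come along with an ARM build of Python.
        return False
    return filename.endswith(tuple(importlib.machinery.EXTENSION_SUFFIXES))

# 'json' is pure Python, so it ships to ARM unchanged; packages that
# bundle compiled extensions would return True and need ARM wheels.
print(is_native_extension("json"))  # → False
```

Walking your full dependency list through a check like this is a cheap way to estimate migration effort before touching any instances.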

If you use Docker containers, the move is relatively quick.

Amazon Linux, Ubuntu Linux, and Red Hat Enterprise Linux are the initial operating systems with ARMv8 support on EC2.

A1 instances use somewhat dated ARM Cortex-A72 processors (released in 2015; the current generation is the A76), which were aimed at high-end smartphones and tablets. So they aren't meant for the same workloads as the Xeon E5 series processors powering the Cx series instance types for servers. In benchmarks by Phoronix, both Intel- and AMD-based instances far outperform the current-generation A1 instances.

What is interesting, though, is the price/performance per dollar:
[Figure: price/performance comparison. Courtesy: Phoronix benchmark]

In a 'real world' test of hosting a website, they still underperform by about 3.5x (albeit at a lower cost).

Currently, A1 instances are not meant for general-purpose workloads. However, owing to the Nitro-based hypervisor, they will be very useful for scale-out workloads: lightweight web servers, containerized micro-services, caching fleets, and the like.

There is, however, a larger trend at play that will benefit customers. Amazon bringing its own processor into the mix alongside Intel and AMD improves choice and should hopefully reduce costs in the long run.