re:Invent 2016 – AWS Provides More Compute Options to Support Wider Workloads

CEO Andy Jassy says that AWS's "superpower" is its speed of innovation. Jassy spoke Wednesday at the AWS re:Invent keynote in Las Vegas, an event attended by 32,000 people and streamed live to an additional 50,000 viewers around the world. Jassy said the cloud platform is set to add over 1,000 new capabilities and services by the end of this year. His keynote revealed many of these, including a slew of new Elastic Compute Cloud (EC2) instance types to support a wider range of workloads.

AWS has expanded its lineup of T2 general-purpose, burstable instances with the new t2.xlarge and t2.2xlarge, both generally available. The t2.xlarge offers 16 GiB of memory and four virtual CPUs (vCPUs), while the t2.2xlarge offers 32 GiB of memory and eight vCPUs.

The new memory-optimized R4 family of instances is also available. It ranges from the r4.large, with 15.25 GiB of memory and two vCPUs, to the r4.16xlarge, with 488 GiB of memory and 64 vCPUs. The R4 instances run on Intel Broadwell processors and offer twice the memory and speed of the R3 instances. They are optimized for BI, database, and in-memory cache applications.

EC2 Elastic GPUs are currently in preview. They let users attach GPU memory, anywhere from 1 GiB up to 8 GiB, to existing instances, which is ideal for graphics-heavy workloads such as industrial design or 3-D modeling. Jassy said the option is extremely useful when you need only a small amount of GPU rather than a full GPU instance.

Jassy also revealed the developer preview of new F1 instances that feature field-programmable gate arrays (FPGAs). He said F1 instances are designed for workloads that require hardware acceleration, such as financial analysis and genomics. There are currently two F1 instances: the f1.2xlarge, with one FPGA, eight vCPUs, and 122 GiB of memory, and the f1.16xlarge, with eight FPGAs, 64 vCPUs, and 976 GiB of memory. Alongside the F1 preview, AWS will also release an FPGA developer Amazon Machine Image (AMI) and an F1 hardware development kit (HDK). Jassy said the F1 instances will be made available to all "in the coming week."

AWS has two more instance types in the pipeline for the first quarter of 2017: the I3, aimed at I/O-intensive workloads, and the C5, which is based on Intel Skylake processors and aimed at compute-intensive workloads.

Al Hilwa, program director at IDC, said the new EC2 instances reflect a sharpening emphasis on machine learning and "streaming computes." In a research note, Hilwa said developers will be happier having a slice of a GPU than paying for a whole GPU instance every time they need one. FPGA instances, he added, will be used for highly customized computation workloads that rely on floating-point numbers, with gaming and other testing applications the most prominent examples. The main change compared with a few years ago, however, is the rise of image, video, and audio stream processing, often done in the context of preparing data for machine learning.

More at re:Invent 2016.
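
To make the new instance names above a little more concrete, here is a minimal sketch of launching one of them with the boto3 EC2 client. The region, AMI ID, and key pair name are illustrative placeholders and are not part of the announcement.

```python
# Minimal sketch: launch one of the newly announced EC2 instance types with boto3.
# The AMI ID, key pair, and region below are placeholders, not values from the article.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",       # placeholder AMI ID
    InstanceType="t2.xlarge",     # new burstable type: 4 vCPUs, 16 GiB of memory
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",        # placeholder key pair name
)
print(response["Instances"][0]["InstanceId"])
```

The same call covers the other sizes mentioned above (t2.2xlarge, r4.large, f1.2xlarge, and so on) by swapping the InstanceType value, subject to each type's availability.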

  • What AWS Offers to Mobile Developers at Re:Invent 2016
  • AWS Provides In-Depth Debugging for Developers with X-Ray
  • AWS Intros Visual Workflows for Distributed Apps
  • AWS Launches Shield Service To Stop DDoS Attacks
  • AWS simplifies Virtual Private Servers with Lightsail
  • AWS Supersizes Its Snowball Data Transfer Appliance
  • Amazon AI Services Now Available to Help Developers Build Smarter Apps
  • Amazon Athena enables serverless SQL queries of S3 Data
  • AWS Provides More Compute Options to Support Wider Workloads
  • Amazon Launches APN To Benefit Partners Who Specialize
  • IoT, Financial Competencies Come to AWS Partners
  • Expert Partners Found for a New AWS Tool