AWS’s Rate of Innovation Is Unmatched

I just returned from a week in Las Vegas at AWS re:Invent, Amazon Web Services’ annual conference. I have attended or watched the live stream every year for the past several years, and I am continually amazed at the number of new services and features AWS cranks out annually. Throughout each year, I keep reading about how the other public cloud providers are gaining ground on AWS. However, I am not seeing that. Amazon is dominating with large enterprises and Fortune 500 companies, while many of the big wins from the other cloud providers come from companies pursuing multicloud strategies or targeting specific workload types (e.g., Google for big data workloads).

Day 1 Announcements

What I witnessed last week was an onslaught of feature releases. Andy Jassy, CEO of AWS, announced a long list of new services.

  • Amazon Lightsail allows users to choose a configuration and launch a virtual machine preconfigured with SSD-based storage, DNS management, and a static IP address. Plans start at $5/month.
  • Amazon Athena is a serverless interactive SQL query service that enables easy analysis of large amounts of data stored in Amazon S3.
  • Artificial Intelligence on AWS
    • Amazon Rekognition is a powerful image detection and recognition solution powered by deep learning and now available to use in your own application. Built over many years, Amazon Rekognition can comprehend scenes, objects, and faces.
    • Amazon Lex uses the same deep learning technologies as Amazon’s Alexa to enable users to build conversational voice and text interfaces.
    • Amazon Polly converts text to speech in forty-seven voices and twenty-four languages.
  • AWS Greengrass is built for offline operations, simplifying the implementation of local processing and addressing connectivity issues.
  • AWS Snowball Edge expands the scope of Snowball by adding more connectivity, storage, horizontal scalability, local endpoints, and Lambda functionality.
  • AWS Snowmobile is a 45-foot-long shipping container that can store up to 100 PB of data and can help users move exabytes to AWS in a matter of weeks.
  • Amazon Aurora (PostgreSQL-Compatible Edition) offers high durability, high availability, and the ability to quickly create and deploy read replicas, while delivering 2x the performance of PostgreSQL running in traditional environments.
  • EC2 Instance Type Update
    • New F1 instances give users access to programmable hardware (FPGAs) that enables code to run up to thirty times faster.
    • New R4 instances have larger L3 cache and higher memory speeds for memory-intensive business intelligence.
    • Expanded T2 instances offer the same baseline performance with the added ability to burst to a full core when extra computing power is needed.
    • New Elastic GPUs allow users to add high-performance graphics acceleration to existing EC2 instance types.
    • New I3 instances are designed for the most demanding I/O-intensive workloads: relational and NoSQL databases, transactional processing, and analytics. They deliver up to 3.3 million random IOPS at a 4 KB block size.
    • New C5 instances feature the fastest processors of any EC2 instance. They support ENA on the network side and are EBS-optimized by default.
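Of the day-1 announcements, Athena is perhaps the easiest to try from code. Below is a minimal sketch of what querying S3 data through Athena looks like with the standard boto3 `start_query_execution` API; the database name, table, and S3 bucket here are hypothetical placeholders, and the actual call (commented out) assumes AWS credentials are configured.

```python
import json

def build_athena_query(database, output_bucket):
    """Build the parameters for Athena's StartQueryExecution API call."""
    return {
        # Plain SQL over data sitting in S3 -- no servers to provision.
        "QueryString": (
            "SELECT status, COUNT(*) AS hits "
            "FROM access_logs GROUP BY status ORDER BY hits DESC"
        ),
        "QueryExecutionContext": {"Database": database},
        # Athena writes result files to an S3 location you designate.
        "ResultConfiguration": {
            "OutputLocation": f"s3://{output_bucket}/athena-results/"
        },
    }

# Hypothetical database and bucket names, for illustration only.
params = build_athena_query("weblogs", "my-logs-bucket")
print(json.dumps(params, indent=2))

# With credentials in place, the real call would be:
#   import boto3
#   athena = boto3.client("athena")
#   response = athena.start_query_execution(**params)
```

The appeal is that the "cluster" disappears entirely: you pay per query against data already in S3, with no warehouse to size or manage.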

Jassy talked about AWS’s rate of innovation. By the end of 2016, AWS will have released nearly a thousand new services and features. That is an average of nearly three releases each day. I have sat through vendor pitches in which legacy vendors clinging on for dear life claimed they were innovating faster than AWS. Nothing could be further from the truth.

AWS innovation

After every announcement, I could picture a category of vendors shaking in their boots. With Aurora adding PostgreSQL compatibility to its toolbox, relational database vendors like Oracle have to be concerned. AWS is making it easier each year for enterprises to switch from traditional database technologies to managed database services on AWS. The new AI offerings (Lex, Polly, Rekognition) close a feature gap AWS faces when going head to head with Google. While these services are likely not yet as mature and robust as the Google Vision and Speech APIs, they give buyers who are already on Amazon yet another reason not to look beyond AWS for their cloud service needs.
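To make the AI point concrete, here is a hedged sketch of what using Rekognition looks like through boto3's DetectLabels API. The bucket and object key are hypothetical, and the actual call (commented out) requires AWS credentials and an image already uploaded to S3.

```python
def build_detect_labels_request(bucket, key, max_labels=10, min_confidence=75.0):
    """Parameters for Rekognition's DetectLabels call via boto3."""
    return {
        # Rekognition reads the image directly from S3.
        "Image": {"S3Object": {"Bucket": bucket, "Name": key}},
        "MaxLabels": max_labels,
        # Only return labels the model is reasonably confident about.
        "MinConfidence": min_confidence,
    }

# Hypothetical bucket and key, for illustration only.
request = build_detect_labels_request("my-photo-bucket", "vacation/beach.jpg")

# With credentials configured, the real call would be:
#   import boto3
#   rekognition = boto3.client("rekognition")
#   labels = rekognition.detect_labels(**request)["Labels"]
```

The selling point is that the deep learning is entirely behind the API: there is no model to train or host before you can label scenes, objects, and faces.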

Andy closed the session by wheeling a huge semi onstage for the new AWS Snowmobile, which can ingest data from enterprises housing exabytes of it. The public cloud provider that can attract the most customer data will win, and nobody makes it easier than AWS to transfer truckloads (pun intended) of data to the cloud.

Source: Venturebeat

Day 2 Announcements

If that wasn’t enough muscle to flex, on day two, Amazon CTO Werner Vogels followed up with another impressive list of new services.

  • AWS Config now integrates with Amazon EC2 Systems Manager to provide continuous monitoring and governance of your EC2 instances and on-premises systems.
  • AWS CodeBuild streamlines your development process, eliminating the need to provision servers ahead of time and allowing you to scale according to your build volume.
  • AWS Personal Health Dashboard gives you a personalized view into the performance and availability of your AWS services.
  • AWS X-Ray allows users to trace data from code running on EC2 instances, AWS Elastic Beanstalk, Amazon API Gateway, and more from beginning to end to give you the visualization capabilities you need to see deeper into applications.
  • AWS Shield is a new managed service that protects web applications from distributed denial-of-service (DDoS) attacks. There are two tiers: AWS Shield Standard and AWS Shield Advanced.
  • Amazon Pinpoint helps you measure and improve user engagement for your mobile apps by using real-time analytics to better understand your users’ behavior.
  • AWS Glue guides you through the process of moving data by preparing and loading your data for easy transfer between data stores.
  • AWS Batch gives batch administrators, developers, and users access to the power of the cloud without managing, monitoring, or maintaining clusters.
  • Blox is a new open-source scheduler for Amazon EC2 Container Service that consumes an event stream, uses it to track the state of the cluster, and makes the state accessible via a set of REST APIs.
  • The Lambda@Edge preview allows users to write JavaScript code that runs within the network of AWS edge locations.
  • AWS Step Functions allow users to coordinate the components of applications as a series of steps in a visual workflow.
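Of the day-2 list, Step Functions is worth a closer look: workflows are expressed declaratively in the Amazon States Language as JSON. Below is a minimal sketch of a two-step state machine; the Lambda ARNs are placeholders, and deploying it would require a call such as boto3's `create_state_machine` with an IAM role.

```python
import json

# A two-state workflow in the Amazon States Language: validate an order,
# then charge the card. Each Task state points at a Lambda function.
definition = {
    "Comment": "Two-step order pipeline expressed as a visual workflow",
    "StartAt": "ValidateOrder",
    "States": {
        "ValidateOrder": {
            "Type": "Task",
            # Placeholder ARN -- substitute a real Lambda function.
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:validate",
            "Next": "ChargeCard",
        },
        "ChargeCard": {
            "Type": "Task",
            # Placeholder ARN -- substitute a real Lambda function.
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:charge",
            "End": True,
        },
    },
}

print(json.dumps(definition, indent=2))

# With credentials and a role in place, deployment would look like:
#   import boto3
#   sfn = boto3.client("stepfunctions")
#   sfn.create_state_machine(
#       name="OrderPipeline",
#       definition=json.dumps(definition),
#       roleArn="arn:aws:iam::123456789012:role/StepFunctionsRole",  # placeholder
#   )
```

The design choice is notable: the orchestration logic lives outside the Lambda functions themselves, so retries, sequencing, and branching no longer have to be hand-coded into each function.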

Werner drove home the fact that AWS is a serious player in IoT and containers. AWS has a brilliant distribution strategy for IoT: it has agreements with some of the world’s predominant chip suppliers to deliver new chips with AWS’s Lambda functionality embedded, so developers will have the same Lambda SDK on the chip that they have in the cloud. No other cloud provider offers this option, which can give AWS a huge advantage as enterprises start to embrace IoT.

He also made a couple of very important points. One is that security trumps everything, and the company has invested heavily in adding to its already industry-leading security feature set. AWS Shield is a managed service for protecting against denial-of-service attacks and is free for customers. AWS Shield Advanced is a paid tier that provides even more functionality and includes access to AWS’s DDoS Response Team.


The second point Werner hammered on was agility. The more time we spend doing IT plumbing work, the less time we spend delivering business value. AWS released a number of services that abstract away a lot of the plumbing work of existing services and processes. EC2 Systems Manager simplifies infrastructure management, CodeBuild simplifies the build and deployment pipeline, Glue streamlines data ingestion and ETL processes, and OpsWorks for Chef takes the pain out of managing Chef servers by providing a managed service.

Werner reminded us that 80% of the work that goes into producing analytics and insights is not analytics. He listed ten categories of work that are required to make analytics actionable.

  1. Ingestion
  2. Preserving original source of data
  3. Lifecycle management
  4. Capturing metadata
  5. Managing governance, security, privacy
  6. Self-service discovery, search, access
  7. Managing data quality
  8. Preparing for analytics
  9. Orchestration and scheduling
  10. Capturing data change

AWS made a major investment in a suite of services to reduce the effort required to make data actionable so that developers and data scientists can spend more time working with real data instead of mucking with hardware and software. A large number of services and features within existing services were added to the AWS service catalog, with a focus on automation and simplicity.


Moving Forward

Jassy and Werner announced so many new features that it would require a dozen blog posts to cover them all. They released a number of features focused on serverless, containers, monitoring and logging, and much more that I have not covered here. X-Ray alone poses a serious threat to APM providers like New Relic, AppDynamics, and many others. AWS also released services focused on hybrid clouds, including a partnership with VMware that is still light on details. It increased the size of Snowball to 100 TB and even made it programmable. The rate of innovation coming out of AWS is just mind-boggling.



AWS continues to pioneer cloud computing with first-in-market services. It makes sense that AWS is the first to bring these services to market: a large majority of companies using the public cloud are on AWS, and the majority of customers with years of experience solving complex issues in the cloud have been doing it on AWS. Its strategy seems to be twofold: defend and extend. The defend strategy focuses on closing feature gaps (machine learning, AI, etc.). The extend strategy focuses on continuing to mature, enhance, and abstract away complexity in areas where AWS already leads (server automation, CI/CD automation, container abstraction, IoT on the edge, etc.). We still have many years ahead before this market matures, and each of the big three public cloud providers will have its share of big wins, but don’t believe the hype that any vendor is gaining on AWS. AWS has a several-year head start on the competition and continues to innovate while the others try to catch up on features AWS delivered two to three years ago.

It will be interesting to see how AWS will top this, but you can bet that its service catalog will be even more amazing at this time next year. And for you private cloud fans: at what point will you realize that the developer agility that the public cloud providers are enabling with a rich suite of services and abstraction far exceeds anything anyone will ever be able to provide on-premises? Chew on that while AWS releases its next 1,500 to 2,000 services in 2017.