In late November of 2019, AWS announced a new specialty certification focused specifically on database technologies. The beta period for the AWS Certified Database - Specialty exam started in early December of 2019, with general availability of the standard exam targeted for April of 2020. I sat the beta exam at the start of December. As of this writing, I do not know my results or whether I achieved the certification. Nevertheless, I thought it would be useful to share my experience. I'll try to list the topics covered by the exam questions and provide some insight and resources that are helpful in preparing for the exam.
The database certification exam covers five domains:
- Domain 1: Workload-Specific Database Design (26%)
- Domain 2: Deployment and Migration (20%)
- Domain 3: Management and Operations (18%)
- Domain 4: Monitoring and Troubleshooting (18%)
- Domain 5: Database Security (18%)
The exam's goal is to test the candidate's ability to:
- Understand the various AWS database services
- Recommend and design database solutions appropriate for specific problem requirements
AWS suggests that the examinee has:
- Minimum of 5 years of experience working with relational and NoSQL databases, including on-premises and cloud-based implementations
- Minimum of 2 years of hands-on experience with AWS
- Amazon RDS is one of the core services offered by AWS, so a thorough understanding of this service is critical to succeeding on the exam.
- You should be able to analyze requirements and identify use cases where a relational database (and specifically RDS) is the appropriate solution. You should also have a good understanding of the service's capabilities and limitations.
- Multi-AZ: Know technical details of its implementation and when it is appropriate (e.g. disaster recovery scenarios)
- Read Replicas: Know technical details and in what situations they are useful (e.g. off-loading read traffic); a short boto3 sketch follows this group of bullets
- Know the numerous engine options RDS has available
- There are engine specific questions, so you should have some familiarity with each engine (e.g. Oracle TDE, PostgreSQL specific capabilities)
- Backups
- Option Groups
- DB Parameter Groups
- Performance Insights (what it is, when it is applicable, and how it differs from CloudWatch)
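To make the Multi-AZ and Read Replica bullets above more concrete, here is a minimal boto3 sketch. The instance identifiers and region are hypothetical placeholders, not values from the exam or from any real environment.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Create a read replica to off-load read traffic from a (hypothetical) source instance.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="app-db-replica-1",      # placeholder name
    SourceDBInstanceIdentifier="app-db-primary",  # placeholder name
)

# Check whether the source instance is deployed as Multi-AZ
# (a synchronous standby in another AZ, used for HA/DR rather than read scaling).
desc = rds.describe_db_instances(DBInstanceIdentifier="app-db-primary")
print("MultiAZ:", desc["DBInstances"][0]["MultiAZ"])
```

The key distinction the exam leans on is visible here: a read replica is asynchronous and serves reads, while Multi-AZ is a synchronous standby used for failover.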
- Alongside RDS, DynamoDB is probably the most crucial of AWS database services. You should know it inside and out.
- Understand Primary Keys/Partition Keys/Sort Keys. Given a particular scenario, be able to design/choose them.
- Global Secondary Indexes / Local Secondary Indexes. Understand the difference between them and applicable use cases.
- DynamoDB Streams
- Global Tables (and their use cases)
- Data Modeling
- Partition sharding (e.g. adding a random suffix to the partition key); see the write-sharding sketch below
- Composite Keys
- Design patterns and best practices
- Querying and Filtering
- DAX (vs ElastiCache)
- Have a good understanding of the underlying technical architecture
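Here is a minimal write-sharding and composite-key sketch using boto3. The table name, key attribute names, and shard count are assumptions for illustration only.

```python
import random
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("OrderEvents")  # hypothetical table with "pk" (partition key) and "sk" (sort key)

SHARD_COUNT = 10  # assumed number of suffixes used to spread a hot partition key

def put_order_event(customer_id: str, timestamp: str, payload: dict) -> None:
    # Append a random suffix so writes for one hot customer spread across partitions.
    shard = random.randint(0, SHARD_COUNT - 1)
    table.put_item(
        Item={
            "pk": f"CUSTOMER#{customer_id}#{shard}",  # sharded partition key
            "sk": f"EVENT#{timestamp}",               # composite sort key enables range queries
            **payload,
        }
    )

def query_order_events(customer_id: str) -> list:
    # The trade-off: reads must fan out across all shards and merge the results.
    items = []
    for shard in range(SHARD_COUNT):
        resp = table.query(
            KeyConditionExpression=Key("pk").eq(f"CUSTOMER#{customer_id}#{shard}")
        )
        items.extend(resp["Items"])
    return items
```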
- Amazon Aurora
- Understand the technical architecture and use cases
- Difference between Aurora Replicas and RDS Read Replicas
- Scaling of Aurora
- Understand Aurora Serverless
- Understand Aurora clones and when you should use them (a minimal clone-creation sketch is shown below)
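As a sketch of the clone topic, an Aurora clone can be created through the point-in-time restore API with a copy-on-write restore type. The cluster identifiers below are placeholders.

```python
import boto3

rds = boto3.client("rds")

# An Aurora clone is a copy-on-write restore of an existing cluster, useful for
# spinning up a test or analytics copy without duplicating storage up front.
rds.restore_db_cluster_to_point_in_time(
    DBClusterIdentifier="aurora-clone-for-testing",   # placeholder name for the new clone
    SourceDBClusterIdentifier="aurora-prod-cluster",  # placeholder name for the source
    RestoreType="copy-on-write",
    UseLatestRestorableTime=True,
)
# Note: instances still need to be added to the new cluster before it can serve traffic.
```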
- CloudWatch
- Understand the service functionality
- How to use it in troubleshooting scenarios
- Performance monitoring and notification/alerting scenarios (see the alarm sketch below)
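A typical monitoring/alerting scenario is alarming on a database metric and notifying an SNS topic. This is a minimal sketch; the alarm name, instance identifier, threshold, and topic ARN are all hypothetical.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm when average CPU on an RDS instance stays above 80% for 15 minutes,
# notifying a (hypothetical) SNS topic.
cloudwatch.put_metric_alarm(
    AlarmName="app-db-primary-high-cpu",
    Namespace="AWS/RDS",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "app-db-primary"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=3,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:db-alerts"],  # placeholder ARN
)
```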
- Database Migration
- There are questions about AWS SCT (Schema Conversion Tool) and AWS DMS (Database Migration Service). You should have a general understanding of both, the difference between them, and when you would use one versus the other; a minimal DMS task sketch follows.
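The short version of the distinction: SCT converts the schema (for heterogeneous migrations), while DMS moves the data and can keep replicating ongoing changes. Below is a minimal boto3 sketch of a DMS task; the endpoint and replication instance ARNs are placeholders for resources created beforehand.

```python
import json
import boto3

dms = boto3.client("dms")

# Include every table in every schema; real migrations usually use narrower rules.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-all",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "include",
    }]
}

dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-aurora-task",
    SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",    # placeholder
    TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",    # placeholder
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",  # placeholder
    MigrationType="full-load-and-cdc",  # initial load plus ongoing change data capture
    TableMappings=json.dumps(table_mappings),
)
```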
- ElastiCache
- Have a good understanding of use cases when caching is appropriate
- Focus on Redis
- Understand Redis architecture, Multi-AZ, etc.
- Understand the various caching strategies (lazy loading, write-through) and the benefits/limitations of each; see the sketch below
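Here is a minimal sketch of the two caching strategies with redis-py against an ElastiCache Redis endpoint. The endpoint, TTL, and the `db.load_user`/`db.save_user` helpers are assumptions made for illustration.

```python
import json
import redis

# Hypothetical ElastiCache Redis endpoint.
cache = redis.Redis(host="my-cache.xxxxxx.ng.0001.use1.cache.amazonaws.com", port=6379)
TTL_SECONDS = 300

def get_user_lazy(user_id, db):
    """Lazy loading: read the cache first, fall back to the database on a miss."""
    cached = cache.get(f"user:{user_id}")
    if cached is not None:
        return json.loads(cached)
    user = db.load_user(user_id)  # assumed database helper
    cache.setex(f"user:{user_id}", TTL_SECONDS, json.dumps(user))
    return user

def save_user_write_through(user, db):
    """Write-through: update the database and the cache in the same code path."""
    db.save_user(user)  # assumed database helper
    cache.setex(f"user:{user['id']}", TTL_SECONDS, json.dumps(user))
```

Lazy loading only caches what is actually read but can serve stale data until the TTL expires; write-through keeps the cache fresh at the cost of extra writes and cached data that may never be read.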
- DocumentDB
- I don't recall many questions on this service; however, you should have a general familiarity with it
- Redshift
- Have a general understanding of data-warehouse topics and columnar storage
- Be familiar with technical architecture (single node deployment / cluster with leader and compute nodes)
- Be familiar with Redshift Spectrum
- Understand use cases for Redshift (e.g. Business Intelligence and Reporting). Be able to differentiate applicability of Redshift vs RDS.
- Neptune
- I don't recall many questions here. However, you should have a general understanding of graph database concepts and graph database applications.
- I suggest being familiar with its technical architecture
- Know supported APIs (Gremlin, SPARQL) and models (Property Graph/TinkerPop and W3C RDF/SPARQL)
- Security
- Understand options and solutions for data encryption (both data at rest and data in transit)
- Authentication options and capabilities
- Access Management (IAM)
- Have a good understanding of services like Parameter Store and KMS; a short sketch follows this group of bullets
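Two security topics that come up together are IAM database authentication and retrieving secrets from Parameter Store (encrypted with KMS). The sketch below assumes hypothetical hostnames, parameter names, and that IAM DB authentication is enabled on the instance.

```python
import boto3

# Generate a short-lived IAM authentication token for a MySQL-compatible RDS/Aurora instance.
rds = boto3.client("rds", region_name="us-east-1")
token = rds.generate_db_auth_token(
    DBHostname="app-db-primary.xxxxxx.us-east-1.rds.amazonaws.com",  # placeholder endpoint
    Port=3306,
    DBUsername="app_user",  # placeholder IAM-enabled database user
)

# Fetch a secret (e.g. a password for engines without IAM auth) from Parameter Store,
# decrypted with the KMS key associated with the SecureString parameter.
ssm = boto3.client("ssm")
password = ssm.get_parameter(Name="/app/db/password", WithDecryption=True)["Parameter"]["Value"]
```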
- Networking
- Understand how to integrate various services with VPC (e.g. VPC Endpoints)
- Know how to secure access at the network level
- Solutions Architecture
- There were a number of questions very reminiscent of the AWS Solutions Architect exam. I strongly suggest attaining the AWS Solutions Architect - Associate certification before attempting this specialty exam.
- Understand AWS Lambda and how it can be used in conjunction with the other services mentioned above (e.g. processing DynamoDB Streams, as in the sketch below)
- Expect questions on High Availability, Disaster Recovery, and Fault Tolerance
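A common Lambda-plus-database pattern is a function triggered by a DynamoDB Stream. This is a minimal handler sketch; what you do with each record (here, just printing) is a placeholder, and the stream must be enabled on the table and mapped to the function.

```python
# Minimal Lambda handler reacting to DynamoDB Streams records, e.g. to fan out
# changes to another service or keep a derived view up to date.
def lambda_handler(event, context):
    for record in event.get("Records", []):
        if record["eventName"] in ("INSERT", "MODIFY"):
            new_image = record["dynamodb"].get("NewImage", {})
            # NewImage values arrive in DynamoDB's typed JSON format, e.g. {"S": "value"}.
            print(record["eventName"], new_image)
    return {"processed": len(event.get("Records", []))}
```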
- Deployment and CI/CD
- There were quite a few (more than I expected) questions relating to deployment and CI/CD concepts. I would strongly suggest attaining the AWS DevOps Engineer - Professional certification (or at least the AWS Certified Developer - Associate certification) before sitting this specialty exam. You need to understand the various DevOps services AWS offers (e.g. CloudFormation, CodeBuild, CodeDeploy) and how they can be used in deployment scenarios.
As mentioned above, I strongly suggest having both the AWS Solutions Architect - Associate and AWS Certified Developer - Associate certifications (or preferably AWS DevOps Engineer - Professional) before attempting this specialty exam. The experience and knowledge required to attain those certifications will go a long way toward helping you do well on this exam.
As suggested by AWS, you should have significant experience designing and implementing solutions consisting of various database technologies. This experience should include considerable understanding and practical use of AWS services.
Lastly, I suggest reviewing the official AWS documentation, AWS whitepapers and guides, blogs, and videos. I found the advanced-level re:Invent and Online Tech Talks videos particularly useful.
Below is a list of a few links I found especially informative.
- Amazon Relational Database Service (Amazon RDS)
- AWS re:Invent 2017: Deep Dive on Amazon Relational Database Service (RDS) (DAT302)
- AWS re:Invent 2018: Aurora Serverless: Scalable, Cost-Effective Application Deployment (DAT336)
- Migrating Microsoft SQL to AWS - AWS Online Tech Talks
- Contains a good demo of DMS
- AWS re:Invent 2017: ElastiCache Deep Dive: Best Practices and Usage Patterns (DAT305)
- AWS re:Invent 2018: ElastiCache Deep Dive: Design Patterns for In-Memory Data Stores (DAT302-R1)
- AWS re:Invent 2018: Amazon DynamoDB Under the Hood: How We Built a Hyper-Scale Database (DAT321)
- AWS re:Invent 2018: Accelerate Database Development and Testing with Amazon Aurora (DAT313)
- Data Design and Modeling for Microservices
- AWS re:Invent 2017: Best Practices for Data Warehousing with Amazon Redshift & Redshift Spectrum (ABD304)
- AWS re:Invent 2018: Amazon DynamoDB Deep Dive: Advanced Design Patterns for DynamoDB (DAT401)
- An Overview of AWS Cloud Data Migration Services
- Migrating Applications Running Relational Databases to AWS: Best Practices Guide
- AWS Database Migration Service Best Practices
- Migrating Your Databases to Amazon Aurora
- Best Practices for Migrating MySQL Databases to Amazon Aurora
- Strategies for Migrating Oracle Databases to AWS
- Best Practices for Running Oracle Database on AWS
- Deploying Microsoft SQL Server on AWS
- Best Practices for Deploying Microsoft SQL Server on AWS
- Performance at Scale with Amazon ElastiCache
- Database Caching Strategies Using Redis
- Getting Started with Amazon DocumentDB (with MongoDB Compatibility)
In my opinion, the AWS Certified Database - Specialty certification is an excellent addition to the AWS Certification program. Database technologies are an integral and crucial part of any cloud-based solution, and strong competency in them is critical to being a successful architect and designing sound solutions.
I hope this guide and its contents are helpful to those preparing to attempt the exam.
Good Luck!