Amazon
Data-Engineer-Associate
80
AWS Certified Data Engineer - Associate (DEA-C01)
A: Store self-managed certificates on the EC2 instances.
B: Use AWS Certificate Manager (ACM).
C: Implement custom automation scripts in AWS Secrets Manager.
D: Use Amazon Elastic Container Service (Amazon ECS) Service Connect.
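Option B (ACM) is the managed way to get certificates that renew automatically. A minimal sketch of how the `request_certificate` call could be prepared with boto3 is below; the domain names are placeholders, and the actual AWS call is left commented out because it needs credentials and a real domain.

```python
def build_acm_request(domain: str, sans: list) -> dict:
    """Build parameters for acm.request_certificate (boto3).
    DNS-validated ACM certificates renew automatically."""
    return {
        "DomainName": domain,                 # placeholder domain
        "SubjectAlternativeNames": sans,      # extra names covered by the cert
        "ValidationMethod": "DNS",            # enables automatic renewal
    }

# Hypothetical usage (requires AWS credentials):
# import boto3
# acm = boto3.client("acm")
# response = acm.request_certificate(**build_acm_request(
#     "example.com", ["www.example.com"]))
```

This keeps certificate issuance and renewal inside AWS instead of storing self-managed certificates on instances (option A).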
A: Use AWS Glue DataBrew to perform extract, transform, and load (ETL) tasks that mask the PII data before analysis.
B: Use Amazon GuardDuty to monitor access patterns for the PII data that is used in the engineering pipeline.
C: Configure an Amazon Macie discovery job for the S3 bucket.
D: Use AWS Identity and Access Management (IAM) to manage permissions and to control access to the PII data.
E: Write custom scripts in an application to mask the PII data and to control access.
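Option A describes masking PII before analysis. In practice a Glue DataBrew masking transform would do this; the illustrative sketch below shows the same idea in plain Python with a regex for SSN-like values (the pattern and retained digits are assumptions, not the DataBrew API).

```python
import re

def mask_ssn(text: str) -> str:
    """Mask all but the last four digits of SSN-like patterns.
    Illustrative only; option A would apply a DataBrew masking transform."""
    return re.sub(r"\b(\d{3})-(\d{2})-(\d{4})\b", r"***-**-\3", text)

masked = mask_ssn("SSN: 123-45-6789")  # "SSN: ***-**-6789"
```

The point of the pattern is that downstream analysts never see the raw identifier, while the record remains joinable on its other fields.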
A: Create data quality checks for the source datasets that the daily reports use. Create a new Amazon Managed Workflows for Apache Airflow (Amazon MWAA) environment. Run the data quality checks by using Airflow tasks that run data quality queries on the columns' data types and the presence of null values. Configure Airflow Directed Acyclic Graphs (DAGs) to send an email notification about the incomplete datasets to the SNS topic so that the data engineer is informed.

B: Create data quality checks on the source datasets that the daily reports use. Create a new Amazon EMR cluster. Use Apache Spark SQL to create Apache Spark jobs in the EMR cluster that run data quality queries on the columns' data types and the presence of null values. Orchestrate the ETL pipeline by using an AWS Step Functions workflow. Configure the workflow to send an email notification about the incomplete datasets to the SNS topic so that the data engineer is informed.
C: Create data quality checks on the source datasets that the daily reports use. Create data quality actions by using AWS Glue workflows to confirm the completeness and consistency of the datasets. Configure the data quality actions to create an event in Amazon EventBridge if a dataset is incomplete. Configure EventBridge to send the event to the Amazon SNS topic so that the data engineer is informed about the incomplete datasets.
D: Create AWS Lambda functions that run data quality queries on the columns' data types and the presence of null values. Orchestrate the ETL pipeline by using an AWS Step Functions workflow that runs the Lambda functions. Configure the Step Functions workflow to send an email notification about the incomplete datasets to the SNS topic so that the data engineer is informed.
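Option C's data quality checks would typically be written in AWS Glue's Data Quality Definition Language (DQDL). A minimal sketch of such a ruleset is below; the column names are assumptions, and the check types (`IsComplete` for null values, `ColumnDataType` for type validation) are the DQDL rules that correspond to the checks described above.

```python
# Hedged sketch: a Glue Data Quality ruleset in DQDL, held as a string
# the way it would be passed to a Glue data quality evaluation.
# Column names "customer_id" and "order_date" are assumptions.
RULESET = """
Rules = [
    IsComplete "customer_id",
    ColumnDataType "order_date" = "DATE"
]
"""

# Hypothetical usage (requires AWS credentials and an existing Glue table):
# import boto3
# glue = boto3.client("glue")
# glue.create_data_quality_ruleset(Name="daily-report-checks",
#                                  Ruleset=RULESET.strip())
```

A failed evaluation can then emit an EventBridge event that is routed to the SNS topic, as option C describes.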