Professional Cloud Database Engineer
240 Minutes · 132 Questions
A: Take nightly snapshots of the primary database instance, and restore them in a secondary zone.
B: Build a change data capture (CDC) pipeline to read transactions from the primary instance, and replicate them to a secondary instance.
C: Create a read replica in another region, and promote the read replica if a failure occurs.
D: Enable high availability (HA) for the database to make it regional.
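Option C above can be sketched with the Cloud SQL CLI. This is a minimal illustration, not a full DR runbook; the instance names (primary-instance, primary-replica) and the region are assumptions:

```shell
# Hedged sketch of option C: a cross-region read replica as a DR target.
# All instance names and regions below are illustrative assumptions.

# Create a read replica of the primary in a different region.
gcloud sql instances create primary-replica \
    --master-instance-name=primary-instance \
    --region=us-east1

# If the primary's region fails, promote the replica to a
# standalone, writable primary (this stops replication).
gcloud sql instances promote-replica primary-replica
```

Promotion is one-way: after promote-replica, the instance no longer replicates from the old primary, so replication must be rebuilt once the original region recovers.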
A: Restore the Cloud SQL instance from the automatic backups in region 3.
B: Restore the Cloud SQL instance from the automatic backups in another zone in region 1.
C: Check the 'Lag Bytes' metric for the read replica instance in the monitoring dashboard to gauge replication lag from the primary instance. Check the replication status using pg_catalog.pg_last_wal_receive_lsn(). Then, fail over to region 2 by promoting the read replica instance.
D: Check your instance operation log for the automatic failover status. Look for the time, type, and status of the operations. If the failover operation was successful, no action is necessary. Otherwise, manually run gcloud sql instances failover.
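For option D, a hedged sketch of inspecting the operation log and performing a manual failover. The instance name is an assumption, and the command requires the instance to have HA enabled:

```shell
# Hedged sketch of option D; "primary-instance" is an illustrative name.

# List recent operations for the instance; look for entries of
# type FAILOVER and check their status and timestamps.
gcloud sql operations list --instance=primary-instance --limit=10

# If no successful automatic failover is recorded, trigger one
# manually (only valid for an HA-configured instance).
gcloud sql instances failover primary-instance
```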
A: Place the bulk of the read and write workloads closer to the default leader region.
B: Use a staleness of at least 15 seconds.
C: Add more read-write replicas.
D: Keep the total CPU utilization under 45% in each region.
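Option B (stale reads) can be illustrated with a timestamp-bound query. A minimal sketch, assuming a Spanner instance named example-instance, a database named example-db, an Orders table, and GNU date for the timestamp arithmetic:

```shell
# Hedged sketch of option B: read at a timestamp at least 15 seconds
# in the past, so a nearby replica can usually serve the read without
# a round trip to the leader region. All names are illustrative.
gcloud spanner databases execute-sql example-db \
    --instance=example-instance \
    --sql='SELECT OrderId, Status FROM Orders' \
    --read-timestamp="$(date -u -d '15 seconds ago' +%Y-%m-%dT%H:%M:%SZ)"
```

Stale reads trade freshness for latency: the replica can answer from data it already has instead of confirming recency with the leader.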
A: Create extracts from your on-premises databases periodically, and push these extracts to Cloud Storage. Upload the changes into BigQuery, and merge them with existing tables.
B: Use Cloud Data Fusion and scheduled workflows to extract data from MySQL. Transform this data into the appropriate schema, and load this data into your BigQuery database.
C: Use Datastream to connect to your on-premises database and create a stream. Have Datastream write to Cloud Storage. Then use Dataflow to process the data into BigQuery.
D: Use Database Migration Service to replicate data to a Cloud SQL for MySQL instance. Create federated tables in BigQuery on top of the replicated instances to transform and load the data into your BigQuery database.
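Option D ends with federated queries against the replicated Cloud SQL instance. A hedged sketch using BigQuery's EXTERNAL_QUERY function; the connection ID and table schema are assumptions:

```shell
# Hedged sketch of option D's load step: query the replicated
# Cloud SQL instance from BigQuery through a federated connection.
# The connection ID and column names are illustrative assumptions.
bq query --use_legacy_sql=false '
SELECT *
FROM EXTERNAL_QUERY(
  "us.cloudsql_replica_connection",
  "SELECT id, amount, updated_at FROM orders"
)'
```

The inner query runs on the Cloud SQL side; only its result set crosses into BigQuery, where it can be transformed and merged into native tables.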