AWS Certified Big Data Specialty (Expired on July 1, 2020) Exam Questions

Question 33:

HikeHills.com (HH) is an online specialty retailer that sells clothing and outdoor recreation gear for trekking, camping, road biking, mountain biking, rock climbing, ice climbing, skiing, avalanche protection, snowboarding, fly fishing, kayaking, rafting, road and trail running, and more.
HH runs its entire online infrastructure on Java-based web applications on AWS. HH captures clickstream data and uses a custom-built recommendation engine to recommend products, which improves sales and helps it understand customer preferences. HH already uses the AWS Kinesis Producer Library (KPL) to collect events and transaction logs and process the stream. The event/log size is around 12 bytes.
HH has the following requirements for processing the ingested data:
- Apply a transformation of the syslog data to CSV format
- Load the captured data, along with the other transformations, into Redshift
- Capture transformation failures
- Capture delivery failures
- Back up the syslog streaming data into a separate S3 bucket
Which statements describe how Kinesis Data Firehose can meet these requirements? Select 3 options.

Answer options:

A. Streaming data can be loaded directly into Redshift from Kinesis Firehose
B. Streaming data is delivered to your S3 bucket first. Kinesis Data Firehose then issues an Amazon Redshift COPY command to load the data from your S3 bucket into your Amazon Redshift cluster
C. Streaming data is delivered to your S3 bucket first. Kinesis Data Firehose then issues an Amazon Redshift Export command to load the data from your S3 bucket into your Amazon Redshift cluster
D. The transformation failures and delivery failures are loaded into processing-failed and errors folders in the same S3 bucket
E. The transformation failures and delivery failures are loaded into transform-failed and delivery-failed folders in the same S3 bucket
F. When Redshift is selected as the destination, source record S3 backup is enabled, and a backup S3 bucket is defined, the untransformed incoming data can be delivered to a separate S3 bucket
G. S3 backups can be managed through bucket policies
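The scenario above can be sketched as a Firehose delivery-stream configuration. The dictionary below mirrors the boto3 `RedshiftDestinationConfiguration` shape: with a Redshift destination, Firehose delivers to an intermediate S3 bucket and then issues a COPY command, a Lambda processor handles the syslog-to-CSV transformation, and enabling source-record S3 backup sends the raw, untransformed data to a separate bucket. All ARNs, bucket names, and the stream name are illustrative assumptions, not values from the question.

```python
# Illustrative Firehose config for the HH scenario (all names/ARNs assumed).
redshift_destination = {
    "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",  # assumed
    "ClusterJDBCURL": "jdbc:redshift://hh-cluster.example.us-east-1"
                      ".redshift.amazonaws.com:5439/hh",  # assumed
    "CopyCommand": {
        "DataTableName": "clickstream_events",  # assumed table
        "CopyOptions": "CSV",  # transformed records arrive as CSV
    },
    "Username": "firehose_user",
    "Password": "REPLACE_ME",
    # Intermediate bucket: Firehose delivers here first, then runs COPY
    # from this bucket into the Redshift cluster.
    "S3Configuration": {
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "BucketARN": "arn:aws:s3:::hh-firehose-staging",  # assumed
    },
    # Lambda data transformation: syslog -> CSV. Records that fail
    # transformation or delivery land in error prefixes of the S3 bucket.
    "ProcessingConfiguration": {
        "Enabled": True,
        "Processors": [{
            "Type": "Lambda",
            "Parameters": [{
                "ParameterName": "LambdaArn",
                "ParameterValue": "arn:aws:lambda:us-east-1:123456789012"
                                  ":function:syslog-to-csv",  # assumed
            }],
        }],
    },
    # Source record backup: the raw, untransformed syslog stream is
    # delivered to a separate S3 bucket.
    "S3BackupMode": "Enabled",
    "S3BackupConfiguration": {
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "BucketARN": "arn:aws:s3:::hh-syslog-raw-backup",  # assumed
    },
}

# Creating the stream would require AWS credentials; not executed here:
# import boto3
# boto3.client("firehose").create_delivery_stream(
#     DeliveryStreamName="hh-clickstream",
#     RedshiftDestinationConfiguration=redshift_destination,
# )
```

Note how the config separates the two buckets: `S3Configuration` is the staging bucket Firehose must use before the Redshift COPY, while `S3BackupConfiguration` holds the untransformed source records.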