Question 397:
Your company recently got a contract to help optimize an IoT-based application. The application consists of thousands of on-field monitoring devices that send bulk data daily via Kinesis. Once the data is processed, a snapshot is saved to S3 for later processing. A deep learning application runs once a day, consumes these snapshots from S3, and generates forecast reports for management. The snapshots in S3 are no longer used after a month. The original team optimized the overall process well and selected different storage options for different kinds of data. An expansion plan put forward by management will increase the number of devices 100-fold, so the team has been asked to pinpoint areas that can be optimized for such a heavy load. Select the cost-effective option that supports this scalability while maintaining the durability of the data.
Answer options:
A. Shift the snapshots from S3 to a Redshift cluster. This will help scale with the additional load; take periodic backups of Redshift to Glacier.
B. Use the S3 Standard storage class to store the snapshots. After processing is complete, change the storage class to Glacier after a month.
C. Use the S3 Infrequent Access storage class to save the snapshots. Use a lifecycle rule to migrate the snapshots to Glacier.
D. Save the snapshots in the Glacier storage class and use the low-cost bulk retrieval option to fetch the snapshots required by the deep learning program.
E. Migrate the snapshots to EFS attached to the machine where the deep learning program runs. Once the program completes its processing, migrate the snapshots to Glacier.
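The S3-to-Glacier transition mentioned in options B and C is typically implemented with a bucket lifecycle configuration. Below is a minimal sketch of such a rule using the shape accepted by boto3's `put_bucket_lifecycle_configuration`; the bucket name and key prefix are hypothetical, and the 30-day transition reflects the question's statement that snapshots are unused after a month.

```python
import json

# Sketch of an S3 lifecycle configuration that transitions snapshots to
# the GLACIER storage class after 30 days. Prefix and bucket name below
# are hypothetical illustrations, not values from the question.
lifecycle_config = {
    "Rules": [
        {
            "ID": "snapshots-to-glacier",
            "Filter": {"Prefix": "snapshots/"},  # hypothetical prefix
            "Status": "Enabled",
            # Snapshots are no longer used after a month, so move them
            # to Glacier after 30 days.
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            # Optionally expire objects once they are no longer needed.
            "Expiration": {"Days": 365},
        }
    ]
}

# Applying the rule would look like this (requires AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="iot-snapshots-example",  # hypothetical bucket name
#     LifecycleConfiguration=lifecycle_config,
# )

print(json.dumps(lifecycle_config, indent=2))
```

The same rule structure works whether the objects start in S3 Standard (option B) or S3 Infrequent Access (option C); only the source storage class differs.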