Question 115:
You work on an application development team at a financial services firm. You and your team are working on a mission-critical project with a very aggressive implementation timeline. For this project, you are building a machine learning model to predict customer retention using customer PII (Personally Identifiable Information) data. This data is highly sensitive and is also governed by SEC (Securities and Exchange Commission) compliance regulations. Therefore, your data ingestion process and data storage must be highly secure, and you have a mandate to encrypt all data at rest. How do you use SageMaker features to ensure all of your model artifacts are highly secure with the least amount of effort on your team's part?
Answer options:
A. Use SSL to encrypt your data on your S3 bucket (where you store your model artifacts and data) and your SageMaker Jupyter notebooks. Then run your SageMaker training jobs, hyperparameter tuning jobs, batch transform jobs, and your inference endpoint using the default SageMaker IAM roles and policies.
B. Use SageMaker Neo, which encrypts your data at rest in your S3 bucket, where you store your model artifacts and data. Then pass an AWS Key Management Service key to your SageMaker Jupyter notebooks, training jobs, hyperparameter tuning jobs, batch transform jobs, and your inference endpoint to encrypt the S3 bucket.
C. Use encrypted S3 buckets for your model artifacts and data. Then pass an AWS Key Management Service key to your SageMaker Jupyter notebooks, training jobs, hyperparameter tuning jobs, batch transform jobs, and your inference endpoint to encrypt the attached machine learning storage volume.
D. Use your customer-owned AWS Key Management Service key to encrypt your data on the ML EBS volume and in your S3 buckets. Pass your customer-owned Key Management Service key to your SageMaker Jupyter notebooks, training jobs, hyperparameter tuning jobs, batch transform jobs, and your inference endpoint to encrypt the attached machine learning storage volume.
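To make the KMS-based options concrete, here is a minimal sketch of how a customer-managed KMS key is passed to a SageMaker training job so that both the attached ML storage volume and the S3 output location for model artifacts are encrypted. The key ARN, job name, and bucket path are assumed placeholders; the request shape follows the `create_training_job` API (`ResourceConfig.VolumeKmsKeyId` and `OutputDataConfig.KmsKeyId`).

```python
# Assumed placeholder values for illustration only.
KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"

training_job_request = {
    "TrainingJobName": "customer-retention-model",  # assumed name
    "ResourceConfig": {
        "InstanceType": "ml.m5.xlarge",
        "InstanceCount": 1,
        "VolumeSizeInGB": 50,
        # Encrypts the ML storage volume attached to the training instance
        "VolumeKmsKeyId": KMS_KEY_ARN,
    },
    "OutputDataConfig": {
        "S3OutputPath": "s3://example-bucket/model-artifacts/",  # assumed
        # Encrypts the model artifacts SageMaker writes to S3
        "KmsKeyId": KMS_KEY_ARN,
    },
}

# An actual call (requires AWS credentials and the remaining required
# fields such as AlgorithmSpecification, RoleArn, and StoppingCondition):
#   import boto3
#   boto3.client("sagemaker").create_training_job(**training_job_request)
print(training_job_request["ResourceConfig"]["VolumeKmsKeyId"])
```

The same key can be supplied to notebook instances, tuning jobs, batch transform jobs, and endpoint configurations through their corresponding `KmsKeyId` parameters.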