Amazon SAP-C02 Real Exam Questions
The questions for SAP-C02 were last updated on Nov 11, 2024.
- Exam Code: SAP-C02
- Exam Name: AWS Certified Solutions Architect - Professional
- Certification Provider: Amazon
- Latest update: Nov 11, 2024
A company is running a line-of-business (LOB) application on AWS to support its users. The application runs in one VPC, with a backup copy in a second VPC in a different AWS Region for disaster recovery. The company has a single AWS Direct Connect connection between its on-premises network and AWS. The connection terminates at a Direct Connect gateway.
All access to the application must originate from the company’s on-premises network, and traffic must be encrypted in transit through the use of IPsec. The company is routing traffic through a VPN tunnel over the Direct Connect connection to provide the required encryption.
A business continuity audit determines that the Direct Connect connection represents a potential single point of failure for access to the application. The company needs to remediate this issue as quickly as possible.
Which approach will meet these requirements?
- A . Order a second Direct Connect connection to a different Direct Connect location. Terminate the second Direct Connect connection at the same Direct Connect gateway.
- B . Configure an AWS Site-to-Site VPN connection over the internet. Terminate the VPN connection at a virtual private gateway in the secondary Region.
- C . Create a transit gateway. Attach the VPCs to the transit gateway, and connect the transit gateway to the Direct Connect gateway. Configure an AWS Site-to-Site VPN connection, and terminate it at the transit gateway.
- D . Create a transit gateway. Attach the VPCs to the transit gateway, and connect the transit gateway to the Direct Connect gateway. Order a second Direct Connect connection, and terminate it at the transit gateway.
A company has an internal application running on AWS that is used to track and process shipments in the company’s warehouse. Currently, after the system receives an order, it emails the staff the information needed to ship a package. Once the package is shipped, the staff replies to the email and the order is marked as shipped.
The company wants to stop using email in the application and move to a serverless application model.
Which architecture solution meets these requirements?
- A . Use AWS Batch to configure the different tasks required to ship a package. Have AWS Batch trigger an AWS Lambda function that creates and prints a shipping label. Once that label is scanned, as it leaves the warehouse, have another Lambda function move the process to the next step in the AWS Batch job.
- B . When a new order is created, store the order information in Amazon SQS. Have AWS Lambda check the queue every 5 minutes and process any needed work. When an order needs to be shipped, have Lambda print the label in the warehouse. Once the label has been scanned, as it leaves the warehouse, have an Amazon EC2 instance update Amazon SQS.
- C . Update the application to store new order information in Amazon DynamoDB. When a new order is created, trigger an AWS Step Functions workflow, mark the orders as "in progress," and print a package label to the warehouse. Once the label has been scanned and fulfilled, the application will trigger an AWS Lambda function that will mark the order as shipped and complete the workflow.
- D . Store new order information in Amazon EFS. Have instances pull the new information from the NFS and send that information to printers in the warehouse. Once the label has been scanned, as it leaves the warehouse, have Amazon API Gateway call the instances to remove the order information from Amazon EFS.
A company is launching a new web application on Amazon EC2 instances. Development and production workloads exist in separate AWS accounts.
According to the company’s security requirements, only automated configuration tools are allowed to access the production account. The company’s security team wants to receive immediate notification if any manual access to the production AWS account or EC2 instances occurs.
Which combination of actions should a solutions architect take in the production account to meet these requirements? (Select THREE.)
- A . Turn on AWS CloudTrail logs in the application’s primary AWS Region. Use Amazon Athena to query the logs for AwsConsoleSignIn events.
- B . Configure Amazon Simple Email Service (Amazon SES) to send email to the security team when an alarm is activated.
- C . Deploy EC2 instances in an Auto Scaling group. Configure the launch template to deploy instances without key pairs. Configure Amazon CloudWatch Logs to capture system access logs. Create an Amazon CloudWatch alarm that is based on the logs to detect when a user logs in to an EC2 instance.
- D . Configure an Amazon Simple Notification Service (Amazon SNS) topic to send a message to the security team when an alarm is activated.
- E . Turn on AWS CloudTrail logs for all AWS Regions. Configure Amazon CloudWatch alarms to provide an alert when an AwsConsoleSignIn event is detected.
- F . Deploy EC2 instances in an Auto Scaling group. Configure the launch template to delete the key pair after launch. Configure Amazon CloudWatch Logs for the system access logs. Create an Amazon CloudWatch dashboard to show user logins over time.
A solutions architect is designing a network for a new cloud deployment. Each account will need autonomy to modify route tables and make changes. Centralized and controlled egress internet connectivity is also needed. The cloud footprint is expected to grow to thousands of AWS accounts.
Which architecture will meet these requirements?
- A . A centralized transit VPC with a VPN connection to a standalone VPC in each account. Outbound internet traffic will be controlled by firewall appliances.
- B . A centralized shared VPC with a subnet for each account. Outbound internet traffic will be controlled through a fleet of proxy servers.
- C . A shared services VPC to host central assets to include a fleet of firewalls with a route to the internet. Each spoke VPC will peer to the central VPC.
- D . A shared transit gateway to which each VPC will be attached. Outbound internet access will route through a fleet of VPN-attached firewalls.
An e-commerce company is revamping its IT infrastructure and is planning to use AWS services. The company’s CIO has asked a solutions architect to design a simple, highly available, and loosely coupled order processing application. The application is responsible for receiving and processing orders before storing them in an Amazon DynamoDB table. The application has a sporadic traffic pattern and should be able to scale during marketing campaigns to process the orders with minimal delays.
Which of the following is the MOST reliable approach to meet the requirements?
- A . Receive the orders in an Amazon EC2-hosted database and use EC2 instances to process them.
- B . Receive the orders in an Amazon SQS queue and trigger an AWS Lambda function to process them.
- C . Receive the orders using the AWS Step Functions program and trigger an Amazon ECS container to process them.
- D . Receive the orders in Amazon Kinesis Data Streams and use Amazon EC2 instances to process them.
A company is running a data-intensive application on AWS. The application runs on a cluster of hundreds of Amazon EC2 instances. A shared file system also runs on several EC2 instances that store 200 TB of data. The application reads and modifies the data on the shared file system and generates a report. The job runs once monthly, reads a subset of the files from the shared file system, and takes about 72 hours to complete. The compute instances scale in an Auto Scaling group, but the instances that host the shared file system run continuously. The compute and storage instances are all in the same AWS Region.
A solutions architect needs to reduce costs by replacing the shared file system instances. The file system must provide high performance access to the needed data for the duration of the 72-hour run.
Which solution will provide the LARGEST overall cost reduction while meeting these requirements?
- A . Migrate the data from the existing shared file system to an Amazon S3 bucket that uses the S3 Intelligent-Tiering storage class. Before the job runs each month, use Amazon FSx for Lustre to create a new file system with the data from Amazon S3 by using lazy loading. Use the new file system as the shared storage for the duration of the job. Delete the file system when the job is complete.
- B . Migrate the data from the existing shared file system to a large Amazon Elastic Block Store (Amazon EBS) volume with Multi-Attach enabled. Attach the EBS volume to each of the instances by using a user data script in the Auto Scaling group launch template. Use the EBS volume as the shared storage for the duration of the job. Detach the EBS volume when the job is complete.
- C . Migrate the data from the existing shared file system to an Amazon S3 bucket that uses the S3 Standard storage class. Before the job runs each month, use Amazon FSx for Lustre to create a new file system with the data from Amazon S3 by using batch loading. Use the new file system as the shared storage for the duration of the job. Delete the file system when the job is complete.
- D . Migrate the data from the existing shared file system to an Amazon S3 bucket. Before the job runs each month, use AWS Storage Gateway to create a file gateway with the data from Amazon S3. Use the file gateway as the shared storage for the job. Delete the file gateway when the job is complete.
A group of research institutions and hospitals are in a partnership to study 2 PBs of genomic data. The institute that owns the data stores it in an Amazon S3 bucket and updates it regularly. The institute would like to give all of the organizations in the partnership read access to the data. All members of the partnership are extremely cost-conscious, and the institute that owns the account with the S3 bucket is concerned about covering the costs for requests and data transfers from Amazon S3.
Which solution allows for secure data sharing without causing the institute that owns the bucket to assume all the costs for S3 requests and data transfers?
- A . Ensure that all organizations in the partnership have AWS accounts. In the account with the S3 bucket, create a cross-account role for each account in the partnership that allows read access to the data. Have the organizations assume and use that read role when accessing the data.
- B . Ensure that all organizations in the partnership have AWS accounts. Create a bucket policy on the bucket that owns the data. The policy should allow the accounts in the partnership read access to the bucket. Enable Requester Pays on the bucket. Have the organizations use their AWS credentials when accessing the data.
- C . Ensure that all organizations in the partnership have AWS accounts. Configure buckets in each of the accounts with a bucket policy that allows the institute that owns the data the ability to write to the bucket. Periodically sync the data from the institute’s account to the other organizations. Have the organizations use their AWS credentials when accessing the data using their accounts.
- D . Ensure that all organizations in the partnership have AWS accounts. In the account with the S3 bucket, create a cross-account role for each account in the partnership that allows read access to the data. Enable Requester Pays on the bucket. Have the organizations assume and use that read role when accessing the data.
A company has a project that is launching Amazon EC2 instances that are larger than required. The project’s account cannot be part of the company’s organization in AWS Organizations due to policy restrictions to keep this activity outside of corporate IT. The company wants to allow only the launch of t3.small EC2 instances by developers in the project’s account. These EC2 instances must be restricted to the us-east-2 Region.
What should a solutions architect do to meet these requirements?
- A . Create a new developer account. Move all EC2 instances, users, and assets into us-east-2. Add the account to the company’s organization in AWS Organizations. Enforce a tagging policy that denotes Region affinity.
- B . Create an SCP that denies the launch of all EC2 instances except t3.small EC2 instances in us-east-2. Attach the SCP to the project’s account.
- C . Create and purchase a t3.small EC2 Reserved Instance for each developer in us-east-2. Assign each developer a specific EC2 instance with their name as the tag.
- D . Create an IAM policy that allows the launch of only t3.small EC2 instances in us-east-2. Attach the policy to the roles and groups that the developers use in the project’s account.
A company has a data lake in Amazon S3 that needs to be accessed by hundreds of applications across many AWS accounts. The company’s information security policy states that the S3 bucket must not be accessed over the public internet and that each application should have the minimum permissions necessary to function.
To meet these requirements, a solutions architect plans to use an S3 access point that is restricted to specific VPCs for each application.
Which combination of steps should the solutions architect take to implement this solution? (Select TWO.)
- A . Create an S3 access point for each application in the AWS account that owns the S3 bucket. Configure each access point to be accessible only from the application’s VPC. Update the bucket policy to require access from an access point.
- B . Create an interface endpoint for Amazon S3 in each application’s VPC. Configure the endpoint policy to allow access to an S3 access point. Create a VPC gateway attachment for the S3 endpoint.
- C . Create a gateway endpoint for Amazon S3 in each application’s VPC. Configure the endpoint policy to allow access to an S3 access point. Specify the route table that is used to access the access point.
- D . Create an S3 access point for each application in each AWS account and attach the access points to the S3 bucket. Configure each access point to be accessible only from the application’s VPC. Update the bucket policy to require access from an access point.
- E . Create a gateway endpoint for Amazon S3 in the data lake’s VPC. Attach an endpoint policy to allow access to the S3 bucket. Specify the route table that is used to access the bucket.