(Mar-2020) AWS Certified Solutions Architect - Professional (SAP-C01) Exam Dumps


New VCE and PDF Exam Dumps from PassLeader

➢ Vendor: Amazon

➢ Exam Code: SAP-C01

➢ Exam Name: AWS Certified Solutions Architect - Professional

➢ New Questions (Mar/2020)

Visit PassLeader and Download Full Version AWS-Professional Exam Dumps

NEW QUESTION 1
Your department creates regular analytics reports from your company's log files. All log data is collected in Amazon S3 and processed
by daily Amazon Elastic MapReduce (EMR) jobs that generate daily PDF reports and aggregated tables in .csv format for an Amazon
Redshift data warehouse. Your CFO requests that you optimize the cost structure for this system. Which of the following alternatives will
lower costs without compromising average performance of the system or data integrity for the raw data?

A. Use reduced redundancy storage (RRS) for all data in S3.
Use a combination of Spot Instances and Reserved Instances for Amazon EMR jobs.
Use Reserved Instances for Amazon Redshift.
B. Use reduced redundancy storage (RRS) for PDF and .csv data in S3.
Add Spot Instances to EMR jobs.
Use Spot Instances for Amazon Redshift.
C. Use reduced redundancy storage (RRS) for PDF and .csv data in Amazon S3.
Add Spot Instances to Amazon EMR jobs.
Use Reserved Instances for Amazon Redshift.
D. Use reduced redundancy storage (RRS) for all data in Amazon S3.
Add Spot Instances to Amazon EMR jobs.
Use Reserved Instances for Amazon Redshift.

Answer: C
Explanation: Using RRS for all S3 data would compromise the integrity of the raw logs, so only the regenerable PDF and .csv outputs should use RRS. Spot Instances suit the daily EMR jobs, and Reserved Instances suit the always-on Redshift cluster.
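The storage-class split that separates options B/C from A/D can be sketched as a small policy function: raw logs keep full-durability Standard storage, while regenerable outputs (the daily PDF reports and .csv aggregates) can tolerate Reduced Redundancy Storage. The function name and suffix mapping below are illustrative, not an AWS API.

```python
# Sketch of the storage-class policy: raw log data keeps full-durability
# Standard storage, while regenerable outputs (daily PDF reports, .csv
# aggregates) can tolerate Reduced Redundancy Storage (RRS).
# The helper name and suffix mapping are illustrative assumptions.

def storage_class_for(key: str) -> str:
    """Pick an S3 storage class based on whether the object is regenerable."""
    regenerable_suffixes = (".pdf", ".csv")  # derived reports can be rebuilt from raw logs
    if key.endswith(regenerable_suffixes):
        return "REDUCED_REDUNDANCY"
    return "STANDARD"  # raw logs: data integrity must not be compromised

print(storage_class_for("logs/2020/03/01/access.log"))  # STANDARD
print(storage_class_for("reports/2020-03-01.pdf"))      # REDUCED_REDUNDANCY
```

In practice the class is set per object at upload time via the request's storage-class header.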

NEW QUESTION 2
Your website serves on-demand training videos to your workforce. Videos are uploaded monthly in high-resolution MP4 format. Your
workforce is distributed globally, often on the move, and uses company-provided tablets that require the HTTP Live Streaming (HLS)
protocol to watch a video. Your company has no video transcoding expertise and, if required, would need to pay for a consultant. How
do you implement the most cost-efficient architecture without compromising high availability and quality of video delivery?

A. A video transcoding pipeline running on EC2 using SQS to distribute tasks and Auto Scaling to adjust the number of nodes depending on
the length of the queue.
EBS volumes to host videos and EBS snapshots to incrementally backup original files after a few days.
CloudFront to serve HLS transcoded videos from EC2.
B. Elastic Transcoder to transcode original high-resolution MP4 videos to HLS.
EBS volumes to host videos and EBS snapshots to incrementally backup original files after a few days.
CloudFront to serve HLS transcoded videos from EC2.
C. Elastic Transcoder to transcode original high-resolution MP4 videos to HLS.
S3 to host videos with Lifecycle Management to archive original files to Glacier after a few days.
CloudFront to serve HLS transcoded videos from S3.

D. A video transcoding pipeline running on EC2 using SQS to distribute tasks and Auto Scaling to adjust the number of nodes depending on
the length of the queue.
S3 to host videos with Lifecycle Management to archive all files to Glacier after a few days.
CloudFront to serve HLS transcoded videos from Glacier.

Answer: C
Explanation: Elastic Transcoder removes the need for in-house transcoding expertise, S3 with lifecycle archival to Glacier is cheaper and more durable than EBS volumes, and CloudFront serves the HLS output directly from S3.
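The archival step described in option C is an S3 lifecycle rule that moves the original high-resolution uploads to Glacier a few days after transcoding. The bucket layout (an "originals/" prefix) and the 7-day window below are assumptions; the dict matches the shape boto3's `put_bucket_lifecycle_configuration` expects.

```python
# Minimal sketch of the S3 lifecycle rule: archive the original MP4 uploads
# to Glacier a few days after transcoding, leaving the HLS output in place.
# Prefix and day count are illustrative assumptions.

lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-original-mp4s",
            "Filter": {"Prefix": "originals/"},  # only the source MP4s, not the HLS output
            "Status": "Enabled",
            "Transitions": [
                {"Days": 7, "StorageClass": "GLACIER"}  # archive after a few days
            ],
        }
    ]
}

# Applying it would look like this (not executed here):
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="training-videos", LifecycleConfiguration=lifecycle_config)
```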

NEW QUESTION 3
You deployed your company website using Elastic Beanstalk and you enabled log file rotation to S3. An Elastic MapReduce Job is
periodically analyzing the logs on S3 to build a usage dashboard that you share with your CIO. You recently improved overall performance
of the website using CloudFront for dynamic content delivery and your website as the origin. After this architectural change, the usage
dashboard shows that the traffic on your website dropped by an order of magnitude. How do you fix your usage dashboard?

A. Change your log collection process to use CloudWatch ELB metrics as input of the Elastic MapReduce job.
B. Turn on CloudTrail and use trail log files on S3 as input of the Elastic MapReduce job.
C. Enable CloudFront to deliver access logs to S3 and use them as input of the Elastic MapReduce job.
D. Use Elastic Beanstalk "Restart App Server(s)" option to update log delivery to the Elastic MapReduce job.
E. Use Elastic Beanstalk "Rebuild Environment" option to update log delivery to the Elastic MapReduce job.

Answer: C
Explanation: After adding CloudFront, most requests are served from edge caches and never reach the origin, so the origin's logs undercount traffic by design. Enabling CloudFront access logs to S3 and feeding those to the EMR job restores the full picture.
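Option C's CloudFront access logs are tab-separated W3C-style files with "#" comment headers, so the EMR input step needs a small preprocessing pass. The sample line and the abbreviated field list below are illustrative, not the complete CloudFront log schema.

```python
# Sketch of preprocessing CloudFront access logs before feeding them to the
# EMR job. CloudFront standard logs are tab-separated with "#" header lines;
# the field list here is abbreviated and the sample line is made up.

FIELDS = ["date", "time", "edge_location", "sc_bytes",
          "c_ip", "cs_method", "cs_host", "cs_uri_stem", "sc_status"]

def parse_cf_log_line(line: str):
    """Return a dict for one log record, or None for comment headers."""
    if line.startswith("#"):  # skips "#Version:" / "#Fields:" headers
        return None
    values = line.rstrip("\n").split("\t")
    return dict(zip(FIELDS, values))

sample = "2020-03-01\t10:15:02\tIAD89\t2048\t203.0.113.7\tGET\td111111abcdef8.cloudfront.net\t/index.html\t200"
record = parse_cf_log_line(sample)
print(record["cs_uri_stem"], record["sc_status"])  # /index.html 200
```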

NEW QUESTION 4
A web company is looking to implement an intrusion detection and prevention system into their deployed VPC. This platform should have
the ability to scale to thousands of instances running inside of the VPC. How should they architect their solution to achieve these goals?

A. Configure each host with an agent that collects all network traffic and sends that traffic to the IDS/IPS platform for inspection.
B. Configure an instance with monitoring software and an elastic network interface (ENI) in promiscuous mode to sniff all
traffic across the VPC.
C. Create a second VPC and route all traffic from the primary application VPC through the second VPC where the scalable virtualized IDS/IPS
platform resides.
D. Configure servers running in the VPC using the host-based "route" commands to send all traffic through the platform to a scalable virtualized
IDS/IPS.

Answer: A
Explanation: A VPC does not deliver promiscuous-mode traffic to an ENI, and funneling all traffic through a routing choke point does not scale; a host-based agent on each instance scales naturally to thousands of instances.

NEW QUESTION 5
You are running a successful multitier web application on AWS and your marketing department has asked you to add a reporting tier to
the application. The reporting tier will aggregate and publish status reports every 30 minutes from user-generated information that is being
stored in your web application's database. You are currently running a Multi-AZ RDS MySQL instance for the database tier. You also have
implemented ElastiCache as a database caching layer between the application tier and database tier. Please select the answer that will
allow you to successfully implement the reporting tier with as little impact as possible to your database.

A. Launch a RDS Read Replica connected to your Multi AZ master database and generate reports by querying the Read Replica.
B. Continually send transaction logs from your master database to an S3 bucket and generate the reports off the S3 bucket using S3 byte
range requests.
C. Generate the reports by querying the ElastiCache database caching tier.
D. Generate the reports by querying the synchronously replicated standby RDS MySQL instance maintained through Multi-AZ.

Answer: A
Explanation: A Read Replica offloads the reporting queries without touching the master. The Multi-AZ standby cannot be queried, ElastiCache does not hold the full dataset, and shipping transaction logs to S3 adds load and complexity.
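Option A's design is simple endpoint routing: transactional traffic stays on the Multi-AZ primary while the 30-minute reporting job queries the Read Replica. The endpoint names below are hypothetical placeholders.

```python
# Minimal sketch of read-replica routing: reporting workloads go to the
# replica endpoint so the master is not loaded. Hostnames are placeholders.

ENDPOINTS = {
    "primary": "appdb.cluster-xyz.us-east-1.rds.amazonaws.com",
    "replica": "appdb-replica.xyz.us-east-1.rds.amazonaws.com",
}

def endpoint_for(workload: str) -> str:
    """Send read-only reporting workloads to the replica; everything else to the primary."""
    return ENDPOINTS["replica"] if workload == "reporting" else ENDPOINTS["primary"]

print(endpoint_for("reporting"))  # appdb-replica.xyz.us-east-1.rds.amazonaws.com
```

Because MySQL replication is asynchronous, reports may trail the primary by a few seconds, which is acceptable at a 30-minute cadence.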

NEW QUESTION 6
Your firm has uploaded a large amount of aerial image data to S3. In the past, in your on-premises environment, you used a dedicated
group of servers to batch-process this data and used RabbitMQ, an open-source messaging system, to get job information to the servers.
Once processed, the data went to tape and was shipped offsite. Your manager has told you to stay with the current design and to leverage
AWS archival storage and messaging services to minimize cost. Which approach is correct?


A. Use SNS to pass job messages, use CloudWatch alarms to terminate spot worker instances when they become idle. Once data is processed,
change the storage class of the S3 object to Glacier.
B. Use SQS for passing job messages, use CloudWatch alarms to terminate EC2 worker instances when they become idle. Once data is
processed, change the storage class of the S3 objects to Reduced Redundancy Storage.
C. Setup Auto-Scaled workers triggered by queue depth that use spot instances to process messages in SQS. Once data is processed, change
the storage class of the S3 objects to Reduced Redundancy Storage.
D. Setup Auto-Scaled workers triggered by queue depth that use spot instances to process messages in SQS. Once data is processed, change
the storage class of the S3 objects to Glacier.

Answer: D
Explanation: SQS is the drop-in AWS replacement for RabbitMQ, Spot worker instances scaled on queue depth minimize compute cost, and Glacier (not Reduced Redundancy Storage) is the archival storage class the manager asked for.
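The queue-depth-triggered scaling described in options C and D maps the SQS backlog (the `ApproximateNumberOfMessages` metric) to a desired number of Spot workers. The messages-per-worker target and the bounds below are illustrative tuning knobs, not AWS defaults.

```python
# Sketch of scaling Spot workers by SQS queue depth: one worker per
# `msgs_per_worker` backlogged messages, clamped to [min, max].
# All parameter values are illustrative assumptions.
import math

def desired_workers(queue_depth: int, msgs_per_worker: int = 100,
                    min_workers: int = 0, max_workers: int = 20) -> int:
    """Compute a desired fleet size from the queue backlog."""
    wanted = math.ceil(queue_depth / msgs_per_worker)
    return max(min_workers, min(max_workers, wanted))

print(desired_workers(0))     # 0  -> scale to zero when the queue is idle
print(desired_workers(250))   # 3
print(desired_workers(5000))  # 20 -> capped at max_workers
```

In an Auto Scaling group this function corresponds to a scaling policy driven by a CloudWatch alarm on the queue-depth metric.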

NEW QUESTION 7
You need to analyze a large amount of data stored on Amazon S3 using Amazon Elastic MapReduce. You are using
the cc2.8xlarge instance type, whose CPUs are mostly idle during processing. Which of the following would be the most cost-efficient way
to reduce the runtime of the job?

A. Create fewer, larger files in Amazon S3.
B. Use smaller instances that have higher aggregate I/O performance.
C. Create more, smaller files on Amazon S3.
D. Add additional cc2.8xlarge instances by introducing a task group.

Answer: B
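The reasoning behind answer B is back-of-envelope arithmetic: when the job is I/O-bound (CPUs mostly idle), many smaller instances can deliver more aggregate I/O for the same spend than a few large compute-optimized ones. All throughput and price figures below are made-up illustrative values, not real instance specifications.

```python
# Illustrative cost/throughput comparison for an I/O-bound EMR job.
# Prices and per-instance throughput figures are invented for the sketch.

def aggregate_io(count: int, per_instance_mbps: float) -> float:
    """Total fleet throughput = instances * per-instance throughput."""
    return count * per_instance_mbps

budget = 8.0                         # $/hour, illustrative
large = {"price": 2.0, "io": 500.0}  # compute-heavy box: poor I/O per dollar
small = {"price": 0.5, "io": 200.0}  # cheaper box: better I/O per dollar

large_io = aggregate_io(int(budget / large["price"]), large["io"])  # 4 * 500 = 2000
small_io = aggregate_io(int(budget / small["price"]), small["io"])  # 16 * 200 = 3200
print(large_io, small_io)  # 2000.0 3200.0
```

Under these assumed numbers the smaller-instance fleet delivers 60% more aggregate I/O at the same hourly cost, which is why idle CPUs point toward option B.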

NEW QUESTION 8
You are the new IT architect in a company that operates a mobile sleep tracking application. When activated at night, the mobile app is
sending collected data points of 1 kilobyte every 5 minutes to your backend. The backend takes care of authenticating the user and
writing the data points into an Amazon DynamoDB table. Every morning, you scan the table to extract and aggregate last night's data on
a per user basis, and store the results in Amazon S3. Users are notified via Amazon SNS mobile push notifications that new data is
available, which is parsed and visualized by the mobile app. Currently you have around 100k users who are mostly based out of North
America. You have been tasked to optimize the architecture of the backend system to lower cost. What would you recommend? Choose
2 answers.

A. Have the mobile app access Amazon DynamoDB directly instead of JSON files stored on Amazon S3.
B. Write data directly into an Amazon Redshift cluster replacing both Amazon DynamoDB and Amazon S3.
C. Introduce an Amazon SQS queue to buffer writes to the Amazon DynamoDB table and reduce provisioned write throughput.
D. Introduce Amazon Elasticache to cache reads from the Amazon DynamoDB table and reduce provisioned read throughput.
E. Create a new Amazon DynamoDB table each day and drop the one for the previous day after its data is on Amazon S3.

Answer: CE
Explanation: An SQS buffer smooths the nightly regional write spike so the DynamoDB table can be provisioned near the average write rate, and rotating to a new table each day lets the previous day's table be dropped once its data is exported to S3. ElastiCache does not help a once-daily scan, and direct DynamoDB reads from the app would increase provisioned read cost.
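Option C's SQS write buffering rests on simple arithmetic: 100k users each writing a 1 KB point every 5 minutes averages roughly 333 writes/s, but because users are concentrated in North America the nightly peak is much higher. The 3x peak factor below is an assumption for illustration.

```python
# Arithmetic behind buffering writes through SQS: DynamoDB provisioned for
# the average drain rate instead of the nightly peak. Peak factor is assumed.

users = 100_000
point_interval_s = 5 * 60  # one 1 KB data point per user every 5 minutes

avg_writes_per_s = users / point_interval_s  # ~333.3
peak_factor = 3                              # assumed regional nightly spike
peak_writes_per_s = avg_writes_per_s * peak_factor

# Without a buffer, the table must be provisioned for the peak; with SQS,
# workers drain the queue at a steady rate sized near the average.
print(round(avg_writes_per_s, 1))   # 333.3
print(round(peak_writes_per_s, 1))  # 1000.0
```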

NEW QUESTION 9
……


AWS-Professional Exam Dumps AWS-Professional Exam Questions AWS-Professional PDF Dumps AWS-Professional VCE Dumps
https://www.passleader.com/aws-certified-solutions-architect-professional.html
