You have an Azure data solution that contains an enterprise data warehouse in Azure Synapse Analytics named DW1. Several users execute ad hoc queries against DW1 concurrently, and you regularly perform automated data loads to DW1.

You need to ensure that the automated data loads have enough memory available to complete quickly and successfully while the ad hoc queries run. What should you do?

A. Hash-distribute the large fact tables in DW1 before performing the automated data loads.
B. Assign a larger resource class to the automated data load queries.
C. Create sampled statistics for every column in each table of DW1.
D. Assign a smaller resource class to the automated data load queries.

Suggested Answer: B

In a dedicated SQL pool, resource classes determine how much memory each request is granted. To ensure the loading user has enough memory to achieve maximum compression rates, run loads under a user that is a member of a medium or large resource class.

Reference: https://docs.microsoft.com/en-us/azure/sql-data-warehouse/guidance-for-loading-data

This question appears in the Microsoft DP-200 (Azure Data Engineer) exam, part of the Microsoft Certified: Azure Data Engineer Associate certification.
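In a dedicated SQL pool, a resource class is assigned by adding the loading user to the database role named after that class. A minimal T-SQL sketch, assuming a hypothetical loading user named `LoadUser` already exists in the database:

```sql
-- 'LoadUser' is an assumed name; replace with your actual automated-load user.
-- Dynamic resource classes: smallrc (default), mediumrc, largerc, xlargerc.
EXEC sp_addrolemember 'largerc', 'LoadUser';

-- Alternatively, static resource classes (staticrc10 .. staticrc80) reserve a
-- fixed amount of memory regardless of the pool's current scale level, e.g.:
-- EXEC sp_addrolemember 'staticrc60', 'LoadUser';
```

Connections opened by `LoadUser` after this change run with the larger memory grant, while the ad hoc users remain in the default `smallrc` class, which is what the question asks for.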