DRAG DROP -
You manage the Microsoft Azure Databricks environment for a company. You must be able to access a private Azure Blob Storage account. Data must be available to all Azure Databricks workspaces. You need to provide the data access.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:

Suggested Answer:
Step 1: Create a secret scope.
Step 2: Add secrets to the scope.
Note: dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>") retrieves a key that has been stored as a secret in a secret scope.
Step 3: Mount the Azure Blob Storage container.
You can mount a Blob Storage container, or a folder inside a container, through the Databricks File System (DBFS). The mount is a pointer to the Blob Storage container, so the data is never synced locally.
To mount a Blob Storage container or a folder inside a container, use the following Python command (the angle-bracket values are placeholders you replace with your own names):

dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {"<conf-key>": dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})

where dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>") retrieves the key that has been stored as a secret in a secret scope.

References: https://docs.databricks.com/spark/latest/data-sources/azure/azure-storage.html

This question is in the DP-200 Microsoft Azure Data Engineer exam, toward the Microsoft Certified: Azure Data Engineer Associate certificate.
Disclaimer: This website is not related to, affiliated with, endorsed, or authorized by Microsoft. The website does not contain actual questions and answers from Microsoft's certification exams. Trademarks, certification, and product names are used for reference only and belong to Microsoft.
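The three steps above can be sketched end to end. This is a minimal illustration, not the exam's exact answer: all of the names below (storage account "mystorageacct", container "data", scope "blob-scope", key "blob-key", mount point "/mnt/data") are hypothetical placeholders. Steps 1 and 2 happen outside the notebook with the Databricks CLI; step 3 runs inside a notebook, where dbutils is available. The helper functions only build the strings the mount call needs, so they can be checked anywhere.

```python
def wasbs_source(container: str, account: str) -> str:
    """Build the wasbs:// URL for a Blob Storage container."""
    return f"wasbs://{container}@{account}.blob.core.windows.net"


def account_key_conf(account: str) -> str:
    """Spark configuration key that carries the storage account access key."""
    return f"fs.azure.account.key.{account}.blob.core.windows.net"


# Steps 1-2: create the scope and store the account key as a secret,
# using the Databricks CLI from any machine (names are hypothetical):
#
#   databricks secrets create-scope --scope blob-scope
#   databricks secrets put --scope blob-scope --key blob-key
#
# Step 3: mount the container from a notebook. dbutils exists only
# inside a Databricks workspace, so this part is shown as a comment:
#
#   dbutils.fs.mount(
#       source=wasbs_source("data", "mystorageacct"),
#       mount_point="/mnt/data",
#       extra_configs={account_key_conf("mystorageacct"):
#                      dbutils.secrets.get(scope="blob-scope", key="blob-key")})

print(wasbs_source("data", "mystorageacct"))
# → wasbs://data@mystorageacct.blob.core.windows.net
```

Because the mount is workspace-wide, any cluster in that workspace can then read the data through /mnt/data without handling the storage key directly.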