Free Microsoft DP-300 Exam Questions

Absolutely Free DP-300 Exam Practice for Comprehensive Preparation

  • Microsoft DP-300 Exam Questions
  • Provided By: Microsoft
  • Exam: Administering Microsoft Azure SQL Solutions
  • Certification: Microsoft Azure
  • Total Questions: 462
  • Updated On: Feb 18, 2025
  • Rated: 4.9
  • Question 1
    • You have an Azure virtual machine named VM1 on a virtual network named VNet1. Outbound traffic from VM1 to the internet is blocked.

      You have an Azure SQL database named SqlDb1 on a logical server named SqlSrv1.

      You need to implement connectivity between VM1 and SqlDb1 to meet the following requirements:

      • Ensure that all traffic to the public endpoint of SqlSrv1 is blocked.

      • Minimize the possibility of VM1 exfiltrating data stored in SqlDb1.

      What should you create on VNet1?


      Answer: C
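
      The answer options are not reproduced above, but the stated requirements (block all traffic to the public endpoint, minimize exfiltration) describe Azure Private Link: a private endpoint on VNet1 maps SqlSrv1 to a private IP address, and the server's public endpoint can then be denied. A minimal sketch using the azure-mgmt-network Python SDK, with hypothetical subscription, resource group, and resource IDs:

```python
# pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

# Hypothetical values for illustration; substitute your own.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
SUBNET_ID = (
    "/subscriptions/.../resourceGroups/rg1/providers/Microsoft.Network"
    "/virtualNetworks/VNet1/subnets/default"
)
SQL_SERVER_ID = (
    "/subscriptions/.../resourceGroups/rg1/providers/Microsoft.Sql/servers/SqlSrv1"
)

client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create a private endpoint in VNet1 that maps SqlSrv1 to a private IP.
# Combined with "Deny public network access" on the logical server, this
# blocks the public endpoint while keeping VM1 -> SqlDb1 traffic on the VNet.
poller = client.private_endpoints.begin_create_or_update(
    resource_group_name="rg1",
    private_endpoint_name="pe-sqlsrv1",
    parameters={
        "location": "eastus",
        "subnet": {"id": SUBNET_ID},
        "private_link_service_connections": [
            {
                "name": "sqlsrv1-connection",
                "private_link_service_id": SQL_SERVER_ID,
                "group_ids": ["sqlServer"],  # sub-resource name for SQL logical servers
            }
        ],
    },
)
print(poller.result().provisioning_state)
```

      Because a private endpoint is tied to one specific server, VM1 cannot use it to reach arbitrary SQL servers, which is what limits the exfiltration surface compared with a service endpoint.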
  • Question 2
    • You have an on-premises multi-tier application named App1 that includes a web tier, an application tier, and a Microsoft SQL Server tier. All the tiers run on Hyper-V virtual machines.

      Your new disaster recovery plan requires that all business-critical applications can be recovered to Azure.

      You need to recommend a solution to fail over the database tier of App1 to Azure. The solution must provide the ability to test failover to Azure without affecting the current environment.

      What should you include in the recommendation?


      Answer: D
  • Question 3
    • You have SQL Server on an Azure virtual machine that contains a database named DB1. DB1 contains a table named CustomerPII.

      You need to record whenever users query the CustomerPII table.

      Which two options should you enable? Each correct answer presents part of the solution.

      NOTE: Each correct selection is worth one point.


      Answer: B,C
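
      The answer options are not shown above, but recording reads of a specific table on SQL Server (including SQL Server in an Azure VM) is done with SQL Server Audit: a server audit plus a database audit specification that captures SELECT on the table. A minimal sketch that issues the T-SQL through pyodbc, with a hypothetical server name, credentials, and audit file path:

```python
# pip install pyodbc
import pyodbc

# Hypothetical connection string; point it at the SQL Server instance on the VM.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=vm1;DATABASE=master;"
    "UID=sqladmin;PWD=<password>;TrustServerCertificate=yes",
    autocommit=True,  # audit state changes cannot run inside a user transaction
)
cur = conn.cursor()

# 1) A server audit that writes events to a file target on the VM.
cur.execute("""
IF NOT EXISTS (SELECT 1 FROM sys.server_audits WHERE name = 'Audit_CustomerPII')
    CREATE SERVER AUDIT Audit_CustomerPII TO FILE (FILEPATH = 'C:\\SqlAudit\\');
""")
cur.execute("ALTER SERVER AUDIT Audit_CustomerPII WITH (STATE = ON);")

# 2) A database audit specification in DB1 that records every SELECT
#    against dbo.CustomerPII by any principal.
cur.execute("USE DB1;")
cur.execute("""
CREATE DATABASE AUDIT SPECIFICATION AuditSpec_CustomerPII
FOR SERVER AUDIT Audit_CustomerPII
ADD (SELECT ON OBJECT::dbo.CustomerPII BY public)
WITH (STATE = ON);
""")
```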
  • Question 4
    • Case study -

      This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

      To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

      At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

      To start the case study -

      To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

      Overview -

      ADatum Corporation is a retailer that sells products through two sales channels: retail stores and a website.

      Existing Environment -

      ADatum has one database server that has Microsoft SQL Server 2016 installed. The server hosts three mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.

      SALESDB collects data from the stores and the website.

      DOCDB stores documents that connect to the sales data in SALESDB. The documents are stored in two different JSON formats based on the sales channel.

      REPORTINGDB stores reporting data and contains several columnstore indexes. A daily process creates reporting data in REPORTINGDB from the data in SALESDB. The process is implemented as a SQL Server Integration Services (SSIS) package that runs a stored procedure from SALESDB.

      Requirements -

      Planned Changes -

      ADatum plans to move the current data infrastructure to Azure. The new infrastructure has the following requirements:

      Migrate SALESDB and REPORTINGDB to an Azure SQL database.

      Migrate DOCDB to Azure Cosmos DB.

      The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytics process will perform aggregations that must be done continuously, without gaps, and without overlapping.

      As they arrive, all the sales documents in JSON format must be transformed into one consistent format.

      Azure Data Factory will replace the SSIS process of copying the data from SALESDB to REPORTINGDB.

      Technical Requirements -

      The new Azure data infrastructure must meet the following technical requirements:

      Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The encryption must use your own key.

      SALESDB must be restorable to any given minute within the past three weeks.

      Real-time processing must be monitored to ensure that workloads are sized properly based on actual usage patterns.

      Missing indexes must be created automatically for REPORTINGDB.

      Disk IO, CPU, and memory usage must be monitored for SALESDB.

      Question -

      Which windowing function should you use to perform the streaming aggregation of the sales data?


      Answer: D
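
      The aggregation must be "done continuously, without gaps, and without overlapping", which is the definition of tumbling-window semantics (TumblingWindow in Azure Stream Analytics): fixed-size, contiguous windows in which every event is counted exactly once. A small self-contained Python sketch of that windowing model, using made-up sample events:

```python
from collections import defaultdict

# Made-up (timestamp_in_seconds, sale_amount) events for illustration.
events = [(1, 10.0), (4, 5.0), (7, 2.5), (12, 8.0), (14, 1.0), (21, 3.0)]

WINDOW_SIZE = 10  # seconds

# Tumbling windows tile the timeline: each event falls into exactly one
# window, so the aggregation is continuous, gap-free, and non-overlapping.
totals = defaultdict(float)
for ts, amount in events:
    window_start = (ts // WINDOW_SIZE) * WINDOW_SIZE
    totals[window_start] += amount

for start in sorted(totals):
    print(f"[{start}, {start + WINDOW_SIZE}) -> total sales {totals[start]:.2f}")
```

      By contrast, hopping windows overlap (events are counted more than once) and sliding windows emit a result for every event, so neither meets the "without gaps and without overlapping" requirement.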
  • Question 5
    • You want to deploy multiple Azure resources as a unit. Which of the following would you primarily use to define and deploy those resources? (Select the best answer)

      Answer: C
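
      The answer options are not reproduced above, but deploying multiple Azure resources as a single unit is the role of an Azure Resource Manager (ARM) template (or its Bicep equivalent), submitted as one deployment. A minimal sketch using the azure-mgmt-resource Python SDK with a hypothetical inline template that declares a single storage account:

```python
# pip install azure-identity azure-mgmt-resource
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical
client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# A minimal inline ARM template; a real template would declare every
# resource that belongs to the unit, and they all deploy together.
template = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2023-01-01",
            "name": "examplestore123",  # hypothetical; must be globally unique
            "location": "eastus",
            "sku": {"name": "Standard_LRS"},
            "kind": "StorageV2",
        }
    ],
}

poller = client.deployments.begin_create_or_update(
    resource_group_name="rg1",
    deployment_name="demo-deployment",
    parameters={"properties": {"mode": "Incremental", "template": template}},
)
print(poller.result().properties.provisioning_state)
```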

© 2025 DumpsEngine. All rights reserved.
