DP-700 Implementing Data Engineering Solutions Using Microsoft Fabric Questions and Answers

Question 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a KQL database that contains two tables named Stream and Reference. Stream contains streaming data in the following format.

[Exhibit: Stream table format]

Reference contains reference data in the following format.

[Exhibit: Reference table format]

Both tables contain millions of rows.

You have the following KQL queryset.

[Exhibit: KQL queryset]

You need to reduce how long it takes to run the KQL queryset.

Solution: You change project to extend.

Does this meet the goal?

Options:

A.

Yes

B.

No
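
For context on this series: in KQL, project keeps only the listed columns, narrowing the data carried through the rest of the query, whereas extend appends computed columns and keeps everything else. A minimal sketch of the two operators, using hypothetical column names rather than the ones in the exhibit:

// project keeps only the named columns, reducing the data moved downstream.
Stream
| project ColumnA, ColumnB

// extend appends a computed column and retains every existing column,
// so it cannot shrink the result set being processed.
Stream
| extend ColumnC = strlen(ColumnA)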

Question 5

You have the development groups shown in the following table.

[Exhibit: development groups table]

You have the projects shown in the following table.

[Exhibit: projects table]

You need to recommend which Fabric item to use based on each development group's skillset. The solution must meet the project requirements and minimize development effort.

What should you recommend for each group? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Exhibit: answer area]

Options:

Question 6

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Fabric eventstream that loads data into a table named Bike_Location in a KQL database. The table contains the following columns:

BikepointID

Street

Neighbourhood

No_Bikes

No_Empty_Docks

Timestamp

You need to apply transformation and filter logic to prepare the data for consumption. The solution must return data for a neighbourhood named Sands End when No_Bikes is at least 15. The results must be ordered by No_Bikes in ascending order.

Solution: You use the following code segment:

[Exhibit: code segment]

Does this meet the goal?

Options:

A.

Yes

B.

No
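
For reference, one KQL query shape that satisfies the stated requirements (filter on Neighbourhood and No_Bikes, then an explicit ascending sort) is sketched below; the exhibited code segment may differ:

// Rows for Sands End with at least 15 bikes, ordered by No_Bikes ascending.
Bike_Location
| where Neighbourhood == "Sands End" and No_Bikes >= 15
| sort by No_Bikes asc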

Question 7

You need to create the product dimension.

How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Exhibit: Apache Spark SQL code and answer area]

Options:
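
Because the exhibit is unavailable, the following is only a generic Spark SQL sketch of a product-dimension build; every table and column name here is an assumption, not taken from the case study:

-- Deduplicate source products and assign a surrogate key.
CREATE TABLE gold.dim_product AS
SELECT
    ROW_NUMBER() OVER (ORDER BY ProductID) AS ProductKey, -- surrogate key
    ProductID,
    ProductName,
    Category
FROM (
    SELECT DISTINCT ProductID, ProductName, Category
    FROM silver.products
) AS src;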

Question 8

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Fabric eventstream that loads data into a table named Bike_Location in a KQL database. The table contains the following columns:

BikepointID

Street

Neighbourhood

No_Bikes

No_Empty_Docks

Timestamp

You need to apply transformation and filter logic to prepare the data for consumption. The solution must return data for a neighbourhood named Sands End when No_Bikes is at least 15. The results must be ordered by No_Bikes in ascending order.

Solution: You use the following code segment:

[Exhibit: code segment]

Does this meet the goal?

Options:

A.

Yes

B.

No

Question 9

You need to populate the MAR1 data in the bronze layer.

Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.

ForEach

B.

Copy data

C.

WebHook

D.

Stored procedure

Question 10

You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements.

What should you do?

Options:

A.

Create a workspace identity and enable high concurrency for the notebooks.

B.

Create a shortcut and ensure that caching is disabled for the workspace.

C.

Create a workspace identity and use the identity in a data pipeline.

D.

Create a shortcut and ensure that caching is enabled for the workspace.

Question 11

You need to schedule the population of the medallion layers to meet the technical requirements.

What should you do?

Options:

A.

Schedule a data pipeline that calls other data pipelines.

B.

Schedule a notebook.

C.

Schedule an Apache Spark job.

D.

Schedule multiple data pipelines.

Question 12

You need to ensure that the authors can see only their respective sales data.

How should you complete the statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

[Exhibit: statement and answer area]

Options:
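
The statement being completed is most likely a row-level security definition. As a hedged illustration only (the function, table, and column names below are invented for the sketch), the standard T-SQL pattern is a predicate function bound by a security policy:

-- Predicate function: a row qualifies when its AuthorEmail matches the caller.
CREATE FUNCTION dbo.fn_author_filter (@AuthorEmail AS varchar(128))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @AuthorEmail = USER_NAME();

-- Bind the predicate to the sales table as a filter.
CREATE SECURITY POLICY dbo.SalesFilter
ADD FILTER PREDICATE dbo.fn_author_filter(AuthorEmail)
ON dbo.Sales
WITH (STATE = ON);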

Question 13

You have a Fabric workspace named Workspace1.

You plan to configure Git integration for Workspace1 by using an Azure DevOps Git repository. An Azure DevOps admin creates the required artifacts to support the integration of Workspace1.

Which details do you require to perform the integration?

Options:

A.

the project, Git repository, branch, and Git folder

B.

the organization, project, Git repository, and branch

C.

the Git repository URL and the Git folder

D.

the personal access token (PAT) for Git authentication and the Git repository URL

Question 14

You are building a data orchestration pattern by using a Fabric data pipeline named Dynamic Data Copy as shown in the exhibit. (Click the Exhibit tab.)

[Exhibit: Dynamic Data Copy pipeline]

Dynamic Data Copy does NOT use parametrization.

You need to configure the ForEach activity to receive the list of tables to be copied.

How should you complete the pipeline expression? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Exhibit: pipeline expression and answer area]

Options:
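
As a point of reference, when a pipeline is not parameterized, a ForEach activity typically receives its item list from the output of a preceding Lookup activity, and the expression usually has the shape below (the activity name is an assumption for the sketch):

@activity('Lookup tables').output.value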

Question 15

You need to recommend a Fabric streaming solution that will use the sources shown in the following table.

[Exhibit: sources table]

The solution must minimize development effort.

What should you include in the recommendation for each source? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Exhibit: answer area]

Options:

Question 16

You have a Fabric workspace that contains a lakehouse and a semantic model named Model1.

You use a notebook named Notebook1 to ingest and transform data from an external data source.

You need to execute Notebook1 as part of a data pipeline named Pipeline1. The process must meet the following requirements:

• Run daily at 07:00 AM UTC.

• Attempt to retry Notebook1 twice if the notebook fails.

• After Notebook1 executes successfully, refresh Model1.

Which three actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.

Set the Retry setting of the Notebook activity to 2.

B.

Place the Semantic model refresh activity after the Notebook activity and link the activities by using an On completion condition.

C.

Place the Semantic model refresh activity after the Notebook activity and link the activities by using the On success condition.

D.

From the Schedule settings of Notebook1, set the time zone to UTC.

E.

From the Schedule settings of Pipeline1, set the time zone to UTC.

F.

Set the Retry setting of the Semantic model refresh activity to 2.

Question 17

You have an Azure SQL database named DB1.

In a Fabric workspace, you deploy an eventstream named EventStreamDB1 to stream record changes from DB1 into a lakehouse.

You discover that events are NOT being propagated to EventStreamDB1.

You need to ensure that the events are propagated to EventStreamDB1.

What should you do?

Options:

A.

Create a read-only replica of DB1.

B.

Create an Azure Stream Analytics job.

C.

Enable Extended Events for DB1.

D.

Enable change data capture (CDC) for DB1.
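
For background, change data capture on a SQL database is enabled with system stored procedures. A minimal T-SQL sketch (the schema and table names are assumptions):

-- Enable CDC at the database level, then for each table to be captured.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name = N'Orders',
    @role_name = NULL;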

Question 18

You need to recommend a solution to resolve the MAR1 connectivity issues. The solution must minimize development effort.

What should you recommend?

Options:

A.

Add a ForEach activity to the data pipeline.

B.

Configure retries for the Copy data activity.

C.

Configure Fault tolerance for the Copy data activity.

D.

Call a notebook from the data pipeline.

Question 19

You need to recommend a method to populate the POS1 data to the lakehouse medallion layers.

What should you recommend for each layer? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Exhibit: answer area]

Options:

Question 20

You need to ensure that the data analysts can access the gold layer lakehouse.

What should you do?

Options:

A.

Add the DataAnalyst group to the Viewer role for WorkspaceA.

B.

Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.

C.

Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.

D.

Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Question 21

You need to recommend a solution for handling old files. The solution must meet the technical requirements.

What should you include in the recommendation?

Options:

A.

a data pipeline that includes a Copy data activity

B.

a notebook that runs the VACUUM command

C.

a notebook that runs the OPTIMIZE command

D.

a data pipeline that includes a Delete data activity
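
For context on the two notebook options: OPTIMIZE compacts small files in a Delta table, while VACUUM removes files that are no longer referenced by the table and are older than the retention window. A minimal Spark SQL sketch (the table name and retention period are assumptions):

-- Remove unreferenced files older than 7 days from the Delta table's history.
VACUUM lakehouse1.sales RETAIN 168 HOURS;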

Question 22

You need to ensure that the data engineers are notified if any step in populating the lakehouses fails. The solution must meet the technical requirements and minimize development effort.

What should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

[Exhibit: answer area]

Options:
