
DP-600 Implementing Analytics Solutions Using Microsoft Fabric Questions and Answers

Question 4

What should you use to implement calculation groups for the Research division semantic models?

Options:

A.

DAX Studio

B.

Microsoft Power BI Desktop

C.

the Power BI service

D.

Tabular Editor

Question 5

You need to ensure that Contoso can use version control to meet the data analytics requirements and the general requirements. What should you do?

Options:

A.

Store all the semantic models and reports in Data Lake Gen2 storage.

B.

Modify the settings of the Research workspaces to use a GitHub repository.

C.

Store all the semantic models and reports in Microsoft OneDrive.

D.

Modify the settings of the Research division workspaces to use an Azure Repos repository.

Question 6

You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a Delta table named Customer.

When you query Customer, you discover that the query is slow to execute. You suspect that maintenance was NOT performed on the table.

You need to identify whether maintenance tasks were performed on Customer.

Solution: You run the following Spark SQL statement:

REFRESH TABLE customer

Does this meet the goal?

Options:

A.

Yes

B.

No
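
For reference, a minimal PySpark sketch (not part of the original question) of one way to inspect a Delta table's operation log for maintenance activity such as OPTIMIZE or VACUUM, assuming a Fabric notebook attached to Lakehouse1:

# DESCRIBE HISTORY lists the operations recorded in the Delta transaction log.
history_df = spark.sql("DESCRIBE HISTORY customer")
history_df.select("timestamp", "operation", "operationParameters").show(truncate=False)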

Question 7

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Fabric tenant that contains a semantic model named Model1.

You discover that the following query performs slowly against Model1.

DP-600 Question 7

You need to reduce the execution time of the query.

Solution: You replace line 4 by using the following code:

DP-600 Question 7

Does this meet the goal?

Options:

A.

Yes

B.

No

Question 8

You have an Azure Data Lake Storage Gen2 account named storage1 that contains a Parquet file named sales.parquet.

You have a Fabric tenant that contains a workspace named Workspace1.

Using a notebook in Workspace1, you need to load the content of the file to the default lakehouse. The solution must ensure that the content will display automatically as a table named Sales in Lakehouse explorer.

How should you complete the code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 8

Options:
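
For reference, a minimal PySpark sketch (not part of the original question) of loading a Parquet file from ADLS Gen2 and saving it as a managed Delta table so that it appears under Tables in Lakehouse explorer; the abfss path and container name are placeholders:

# Read the Parquet file from the storage account (path shown is a placeholder).
df = spark.read.parquet("abfss://<container>@storage1.dfs.core.windows.net/sales.parquet")

# Saving as a managed Delta table registers it as Sales in the default lakehouse.
df.write.mode("overwrite").format("delta").saveAsTable("Sales")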

Question 9

You have a Fabric tenant that contains a new semantic model in OneLake.

You use a Fabric notebook to read the data into a Spark DataFrame.

You need to evaluate the data to calculate the min, max, mean, and standard deviation values for all the string and numeric columns.

Solution: You use the following PySpark expression:

df.explain()

Does this meet the goal?

Options:

A.

Yes

B.

No
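
For context, a minimal PySpark sketch (not part of the original question) of computing these statistics with DataFrame.summary(), assuming df is the Spark DataFrame described in the scenario; explain() only prints the query execution plan:

# summary() returns the requested statistics for numeric and string columns;
# mean and stddev are null for string columns.
df.summary("min", "max", "mean", "stddev").show()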

Question 10

You need to provide Power BI developers with access to the pipeline. The solution must meet the following requirements:

• Ensure that the developers can deploy items to the workspaces for Development and Test.

• Prevent the developers from deploying items to the workspace for Production.

• Follow the principle of least privilege.

Which three levels of access should you assign to the developers? Each correct answer presents part of the solution. NOTE: Each correct answer is worth one point.

Options:

A.

Build permission to the production semantic models

B.

Admin access to the deployment pipeline

C.

Viewer access to the Development and Test workspaces

D.

Viewer access to the Production workspace

E.

Contributor access to the Development and Test workspaces

F.

Contributor access to the Production workspace

Question 11

You have a Fabric tenant that contains a semantic model named Model1. Model1 uses Import mode. Model1 contains a table named Orders. Orders has 100 million rows and the following fields.

DP-600 Question 11

You need to reduce the memory used by Model1 and the time it takes to refresh the model. Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct answer is worth one point.

Options:

A.

Split OrderDateTime into separate date and time columns.

B.

Replace TotalQuantity with a calculated column.

C.

Convert Quantity into the Text data type.

D.

Replace TotalSalesAmount with a measure.

Question 12

You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a table named Nyctaxi_raw. Nyctaxi_raw contains the following columns.

DP-600 Question 12

You create a Fabric notebook and attach it to lakehouse1.

You need to use PySpark code to transform the data. The solution must meet the following requirements:

• Add a column named pickupDate that will contain only the date portion of pickupDateTime.

• Filter the DataFrame to include only rows where fareAmount is a positive number that is less than 100.

How should you complete the code? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

DP-600 Question 12

Options:
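
For reference, a minimal PySpark sketch (not part of the original question) of the two required transformations, assuming df holds the Nyctaxi_raw data:

from pyspark.sql.functions import col, to_date

# Derive the date-only column from pickupDateTime and keep fares greater than 0 and less than 100.
transformed_df = (
    df.withColumn("pickupDate", to_date(col("pickupDateTime")))
      .filter((col("fareAmount") > 0) & (col("fareAmount") < 100))
)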

Question 13

You have a Fabric tenant that uses a Microsoft Power BI Premium capacity. You need to enable scale-out for a semantic model. What should you do first?

Options:

A.

At the semantic model level, set Large dataset storage format to Off.

B.

At the tenant level, set Create and use Metrics to Enabled.

C.

At the semantic model level, set Large dataset storage format to On.

D.

At the tenant level, set Data Activator to Enabled.

Question 14

You have a Fabric tenant that contains a semantic model. The model uses Direct Lake mode.

You suspect that some DAX queries load unnecessary columns into memory.

You need to identify the frequently used columns that are loaded into memory.

What are two ways to achieve the goal? Each correct answer presents a complete solution.

NOTE: Each correct answer is worth one point.

Options:

A.

Use the Analyze in Excel feature.

B.

Use the Vertipaq Analyzer tool.

C.

Query the $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS dynamic management view (DMV).

D.

Query the DISCOVER_MEMORYGRANT dynamic management view (DMV).

Question 15

You need to recommend which type of Fabric capacity SKU meets the data analytics requirements for the Research division. What should you recommend?

Options:

A.

EM

B.

F

C.

P

D.

A

Question 16

Which workspace role assignments should you recommend for ResearchReviewersGroup1 and ResearchReviewersGroup2? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 16

Options:

Question 17

You need to recommend a solution to group the Research division workspaces.

What should you include in the recommendation? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 17

Options:

Question 18

You need to migrate the Research division data for Productline2. The solution must meet the data preparation requirements. How should you complete the code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 18

Options:

Question 19

You need to resolve the issue with the pricing group classification.

How should you complete the T-SQL statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

DP-600 Question 19

DP-600 Question 19

Options:

Question 20

You need to implement the date dimension in the data store. The solution must meet the technical requirements.

What are two ways to achieve the goal? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.

Options:

A.

Populate the date dimension table by using a dataflow.

B.

Populate the date dimension table by using a Stored procedure activity in a pipeline.

C.

Populate the date dimension view by using T-SQL.

D.

Populate the date dimension table by using a Copy activity in a pipeline.

Exam Code: DP-600
Exam Name: Implementing Analytics Solutions Using Microsoft Fabric
Last Update: Dec 4, 2024
Questions: 101
