
ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers

Question 4

A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.

How can this data be shared?

Options:

A.

The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.

B.

By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.

C.

Contact Snowflake and they will execute the share request for the healthcare company.

D.

Set the share_restriction parameter on the shared object to false.

Question 5

Consider the following scenario where a masking policy is applied on the CREDITCARDNO column of the CREDITCARDINFO table. The masking policy definition is as follows:

[Image: masking policy definition for the CREDITCARDNO column]

Sample data for the CREDITCARDINFO table is as follows:

NAME      EXPIRYDATE   CREDITCARDNO
JOHN DOE  2022-07-23   4321 5678 9012 1234

If the Snowflake system roles have not been granted any additional roles, what will be the result?

Options:

A.

The sysadmin can see the CREDITCARDNO column data in clear text.

B.

The owner of the table will see the CREDITCARDNO column data in clear text.

C.

Anyone with the PI_ANALYTICS role will see the last 4 characters of the CREDITCARDNO column data in clear text.

D.

Anyone with the PI_ANALYTICS role will see the CREDITCARDNO column as '**MASKED**'.

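Since the policy definition image is not reproduced here, the following is a hypothetical sketch of the kind of masking policy the question describes. The policy name, the PI_ANALYTICS logic, and the masked literal are assumptions for illustration, not the exam's actual definition.

-- Hypothetical: PI_ANALYTICS sees only the last 4 characters; all other
-- roles see a fixed masked literal.
CREATE OR REPLACE MASKING POLICY mask_creditcardno AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'PI_ANALYTICS' THEN '**** **** **** ' || RIGHT(val, 4)
    ELSE '**MASKED**'
  END;

-- Attach the policy to the column:
ALTER TABLE creditcardinfo MODIFY COLUMN creditcardno SET MASKING POLICY mask_creditcardno;
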
Question 6

A company has a source system that provides JSON records for various IoT operations. The JSON is loaded directly into a persistent table with a VARIANT field. The data is quickly growing to hundreds of millions of records and performance is becoming an issue. There is a generic access pattern that is used to filter on the create_date key within the VARIANT field.

What can be done to improve performance?

Options:

A.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of TIMESTAMP. When this field is used in the filter, partition pruning will occur.

B.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of varchar. When this field is used in the filter, partition pruning will occur.

C.

Validate the size of the warehouse being used. If the record count is approaching 100s of millions, size XL will be the minimum size required to process this amount of data.

D.

Incorporate the use of multiple tables partitioned by date ranges. When a user or process needs to query a particular date range, ensure the appropriate base table is used.

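To make the extraction pattern concrete, here is a minimal sketch assuming a table named iot_events with a VARIANT column named payload holding an ISO-8601 create_date string; all names are illustrative.

-- Add a typed column for the frequently filtered key and backfill it.
ALTER TABLE iot_events ADD COLUMN create_date TIMESTAMP_NTZ;
UPDATE iot_events SET create_date = TO_TIMESTAMP_NTZ(payload:create_date::STRING);

-- Filters on the typed column can now benefit from partition pruning:
SELECT COUNT(*) FROM iot_events WHERE create_date >= '2024-06-01'::TIMESTAMP_NTZ;
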
Question 7

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file has Parquet-formatted data, the other has CSV-formatted data.

How should the data be joined and aggregated to produce a final result set?

Options:

A.

Use Snowpipe to ingest the two files, then create a materialized view to produce the final result set.

B.

Create a task using Snowflake scripting that will import the files, and then call a User-Defined Function (UDF) to produce the final result set.

C.

Create a JavaScript stored procedure to read, join, and aggregate the data directly from the external stage, and then store the results in a table.

D.

Create a materialized view to read, join, and aggregate the data directly from the external stage, and use the view to produce the final result set.

Question 8

Which data models can be used when modeling tables in a Snowflake environment? (Select THREE).

Options:

A.

Graph model

B.

Dimensional/Kimball

C.

Data lake

D.

Inmon/3NF

E.

Bayesian hierarchical model

F.

Data vault

Question 9

A Developer is having a performance issue with a Snowflake query. The query receives up to 10 different values for one parameter and then performs an aggregation over the majority of a fact table. It then joins against a smaller dimension table. This parameter value is selected by the different query users when they execute it during business hours. Both the fact and dimension tables are loaded with new data in an overnight import process.

On a Small or Medium-sized virtual warehouse, the query performs slowly. Performance is acceptable on a size Large or bigger warehouse. However, there is no budget to increase costs. The Developer needs a recommendation that does not increase compute costs to run this query.

What should the Architect recommend?

Options:

A.

Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The query results will then be cached and ready to respond quickly when the users re-issue the query.

B.

Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The task will be scheduled to align with the users' working hours in order to allow the warehouse cache to be used.

C.

Enable the search optimization service on the table. When the users execute the query, the search optimization service will automatically adjust the query execution plan based on the frequently-used parameters.

D.

Create a dedicated size Large warehouse for this particular set of queries. Create a new role that has USAGE permission on this warehouse and has the appropriate read permissions over the fact and dimension tables. Have users switch to this role and use this warehouse when they want to access this data.

Question 10

A company has several sites in different regions from which the company wants to ingest data.

Which of the following will enable this type of data ingestion?

Options:

A.

The company must have a Snowflake account in each cloud region to be able to ingest data to that account.

B.

The company must replicate data between Snowflake accounts.

C.

The company should provision a reader account to each site and ingest the data through the reader accounts.

D.

The company should use a storage integration for the external stage.

Question 11

An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

Options:

A.

Use the Snowflake Connector for Python, connect to remote storage and download the file.

B.

Use the get command in SnowSQL to retrieve the file.

C.

Use the get command in Snowsight to retrieve the file.

D.

Use the Snowflake API endpoint and download the file.

Question 12

An Architect is integrating an application that needs to read and write data to Snowflake without installing any additional software on the application server.

How can this requirement be met?

Options:

A.

Use SnowSQL.

B.

Use the Snowpipe REST API.

C.

Use the Snowflake SQL REST API.

D.

Use the Snowflake ODBC driver.

Question 13

What considerations need to be taken when using database cloning as a tool for data lifecycle management in a development environment? (Select TWO).

Options:

A.

Any pipes in the source are not cloned.

B.

Any pipes in the source referring to internal stages are not cloned.

C.

Any pipes in the source referring to external stages are not cloned.

D.

The clone inherits all granted privileges of all child objects in the source object, including the database.

E.

The clone inherits all granted privileges of all child objects in the source object, excluding the database.

Question 14

A table contains five columns and it has millions of records. The cardinality distribution of the columns is shown below:

[Image: cardinality distribution of columns C1-C5]

Columns C4 and C5 are mostly used by SELECT queries in the GROUP BY and ORDER BY clauses, whereas columns C1, C2, and C3 are heavily used in the filter and join conditions of SELECT queries.

The Architect must design a clustering key for this table to improve the query performance.

Based on Snowflake recommendations, how should the clustering key columns be ordered while defining the multi-column clustering key?

Options:

A.

C5, C4, C2

B.

C3, C4, C5

C.

C1, C3, C2

D.

C2, C1, C3

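For reference, a multi-column clustering key is defined with ALTER TABLE, and Snowflake's general guidance is to order the columns from lowest to higher cardinality. The table and column names below are placeholders, not the answer.

-- Define (or redefine) a multi-column clustering key:
ALTER TABLE t1 CLUSTER BY (col_low_cardinality, col_higher_cardinality);

-- Check how well the table is clustered on those columns:
SELECT SYSTEM$CLUSTERING_INFORMATION('t1', '(col_low_cardinality, col_higher_cardinality)');
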
Question 15

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.

Call loadHistoryScan every minute for the maximum time range.

C.

Call insertReport every 8 minutes for a 10-minute time range.

D.

Call loadHistoryScan every 10 minutes for a 15-minute time range.

Question 16

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

Options:

A.

1. Create a share and add the database privileges to the share

2. Create a new listing on the Snowflake Marketplace

3. Alter the listing and add the share

4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace

B.

1. Ask the customer to create a new Snowflake account in Azure EAST US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Alter the share and add the customer's Snowflake account to the share

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia)

2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared

3. Create a share and add the database privileges to the share

4. Alter the share and add the customer's Snowflake account to the share

D.

1. Create a reader account in Azure East US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Add the reader account to the share

4. Share the reader account's URL and credentials with the customer

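For context, the replication step that appears in these options looks roughly like the sketch below; the organization, account, and database names are placeholders. The share itself is then created and granted in the target region as the options describe.

-- In the source account: enable replication to your own account in the target cloud/region.
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_east_account;

-- In the target account: create the secondary database and refresh it.
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_west_account.sales_db;
ALTER DATABASE sales_db REFRESH;
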
Question 17

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.

Use, at minimum, the Business Critical edition of Snowflake.

B.

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.

Use the Internal Tokenization feature to obfuscate sensitive data.

D.

Use the External Tokenization feature to obfuscate sensitive data.

E.

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.

Avoid sharing data with partner organizations.

Question 18

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

Options:

A.

All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

B.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

C.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

D.

All rows loaded using a specific COPY statement will have the same timestamp value.

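For reference, a load-time capture column of the kind described might be defined as in this sketch; the table, column, and stage names are assumptions.

CREATE OR REPLACE TABLE raw_loads (
  payload   VARIANT,
  loaded_at TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
);

-- A COPY statement that relies on the column default:
COPY INTO raw_loads (payload)
FROM (SELECT $1 FROM @my_stage)
FILE_FORMAT = (TYPE = 'JSON');
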
Question 19

When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)

Options:

A.

CSV

B.

XML

C.

Avro

D.

JSON

E.

Parquet

Question 20

There are two databases in an account, named fin_db and hr_db, which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because that database is maintained by human resources personnel.

An Architect needs to create a read-only role for certain employees working in the human resources department.

Which permission sets must be granted to this role?

Options:

A.

USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

B.

USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

C.

MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

D.

USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db

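For context, a read-only grant set on a database is usually expressed as below; the role name is an assumption, and which exact combination of privileges is correct is what the question tests.

CREATE ROLE hr_read_only;
GRANT USAGE ON DATABASE hr_db TO ROLE hr_read_only;
GRANT USAGE ON ALL SCHEMAS IN DATABASE hr_db TO ROLE hr_read_only;
GRANT SELECT ON ALL TABLES IN DATABASE hr_db TO ROLE hr_read_only;
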
Question 21

Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

Options:

A.

External table

B.

Materialized view

C.

Search optimization

D.

Result cache

Question 22

What are purposes for creating a storage integration? (Choose three.)

Options:

A.

Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.

B.

Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.

C.

Support multiple external stages using one single Snowflake object.

D.

Avoid supplying credentials when creating a stage or when loading or unloading data.

E.

Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.

F.

Manage credentials from multiple cloud providers in one single Snowflake object.

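For reference, a sketch of a storage integration with two stages that reuse it, supplying no credentials at stage creation; the IAM role ARN, bucket paths, and object names are placeholders.

CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::000000000000:role/snowflake_access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/path1/', 's3://my-bucket/path2/');

-- Multiple stages can reference the same integration, with no credentials supplied:
CREATE STAGE stage_one URL = 's3://my-bucket/path1/' STORAGE_INTEGRATION = s3_int;
CREATE STAGE stage_two URL = 's3://my-bucket/path2/' STORAGE_INTEGRATION = s3_int;
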
Question 23

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.

Call loadHistoryScan every minute for the maximum time range.

C.

Call insertReport every 8 minutes for a 10-minute time range.

D.

Call loadHistoryScan every 10 minutes for a 15-minute time range.

Question 24

Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

A) [Image: query option A]

B) [Image: query option B]

C) [Image: query option C]

D) [Image: query option D]

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

Question 25

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

Options:

A.

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.

Use PURGE = TRUE in the COPY INTO command.

C.

Use PURGE = FALSE in the COPY INTO command.

D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.

Question 26

A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

1. Deployment of Snowflake accounts on two different cloud providers.

2. Selection of cloud provider regions that are geographically far apart.

3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

4. Implementation of Snowflake client redirect.

What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

Options:

A.

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

B.

Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.

C.

Connect the applications using the - URL. Use the Enterprise Snowflake edition.

D.

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

Question 27

At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY privileges be granted?

Options:

A.

Global

B.

Database

C.

Schema

D.

Table

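For reference, these APPLY privileges are granted with statements of the following shape; the role name is an assumption.

GRANT APPLY MASKING POLICY ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY SESSION POLICY ON ACCOUNT TO ROLE governance_admin;
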
Question 28

An Architect has chosen to separate their Snowflake Production and QA environments using two separate Snowflake accounts.

The QA account is intended to run and test changes on data and database objects before pushing those changes to the Production account. It is a requirement that all database objects and data in the QA account need to be an exact copy of the database objects, including privileges and data in the Production account on at least a nightly basis.

Which is the LEAST complex approach to use to populate the QA account with the Production account’s data and database objects on a nightly basis?

Options:

A.

1) Create a share in the Production account for each database

2) Share access to the QA account as a Consumer

3) The QA account creates a database directly from each share

4) Create clones of those databases on a nightly basis

5) Run tests directly on those cloned databases

B.

1) Create a stage in the Production account

2) Create a stage in the QA account that points to the same external object-storage location

3) Create a task that runs nightly to unload each table in the Production account into the stage

4) Use Snowpipe to populate the QA account

C.

1) Enable replication for each database in the Production account

2) Create replica databases in the QA account

3) Create clones of the replica databases on a nightly basis

4) Run tests directly on those cloned databases

D.

1) In the Production account, create an external function that connects into the QA account and returns all the data for one specific table

2) Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account

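For context, the replication-plus-clone pattern named in the options looks roughly like this sketch; the organization, account, and database names are placeholders.

-- In the Production account:
ALTER DATABASE prod_db ENABLE REPLICATION TO ACCOUNTS myorg.qa_account;

-- In the QA account (the refresh would be scheduled nightly, e.g. with a task):
CREATE DATABASE prod_db_replica AS REPLICA OF myorg.prod_account.prod_db;
ALTER DATABASE prod_db_replica REFRESH;
CREATE DATABASE qa_test_copy CLONE prod_db_replica;
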
Question 29

A DevOps team has a requirement for recovery of staging tables used in a complex set of data pipelines. The staging tables are all located in the same staging schema. One of the requirements is to have online recovery of data on a rolling 7-day basis.

After setting up the DATA_RETENTION_TIME_IN_DAYS at the database level, certain tables remain unrecoverable past 1 day.

What would cause this to occur? (Choose two.)

Options:

A.

The staging schema has not been set up for MANAGED ACCESS.

B.

The DATA_RETENTION_TIME_IN_DAYS for the staging schema has been set to 1 day.

C.

The tables exceed the 1 TB limit for data recovery.

D.

The staging tables are of the TRANSIENT type.

E.

The DevOps role should be granted ALLOW_RECOVERY privilege on the staging schema.

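For reference, the retention parameter involved can be set and inspected as below (names are placeholders). A lower schema- or table-level value overrides the database-level setting, and transient tables are capped at 1 day of Time Travel.

ALTER DATABASE staging_db SET DATA_RETENTION_TIME_IN_DAYS = 7;

-- Inspect the effective value at the schema level:
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN SCHEMA staging_db.staging;
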
Question 30

A retail company has 2000+ stores spread across the country. Store Managers report that they are having trouble running key reports related to inventory management, sales targets, payroll, and staffing during business hours. The Managers report that performance is poor and time-outs occur frequently.

Currently all reports share the same Snowflake virtual warehouse.

How should this situation be addressed? (Select TWO).

Options:

A.

Use a Business Intelligence tool for in-memory computation to improve performance.

B.

Configure a dedicated virtual warehouse for the Store Manager team.

C.

Configure the virtual warehouse to be multi-clustered.

D.

Configure the virtual warehouse to size 4-XL.

E.

Advise the Store Manager team to defer report execution to off-business hours.

Question 31

Which statements describe characteristics of the use of materialized views in Snowflake? (Choose two.)

Options:

A.

They can include ORDER BY clauses.

B.

They cannot include nested subqueries.

C.

They can include context functions, such as CURRENT_TIME().

D.

They can support MIN and MAX aggregates.

E.

They can support inner joins, but not outer joins.

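For reference, a simple materialized view with aggregates might look like this sketch; the view, table, and column names are placeholders.

CREATE MATERIALIZED VIEW daily_minmax AS
SELECT event_date,
       MIN(reading) AS min_reading,
       MAX(reading) AS max_reading
FROM sensor_readings
GROUP BY event_date;
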
Question 32

A company wants to integrate its main enterprise identity provider with federated authentication with Snowflake.

The authentication integration has been configured and roles have been created in Snowflake. However, the users are not automatically appearing in Snowflake when created, and their group membership is not reflected in their assigned roles.

How can the missing functionality be enabled with the LEAST amount of operational overhead?

Options:

A.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users and roles.

B.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users, and the resource server must be configured with the right mapping of role assignment.

C.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, their groups will get created as group accounts in Snowflake and the proper roles can be granted.

D.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, users will automatically get created and their group membership will be reflected as roles in Snowflake.

Question 33

Which Snowflake data modeling approach is designed for BI queries?

Options:

A.

3NF

B.

Star schema

C.

Data Vault

D.

Snowflake schema

Question 34

A company has an external vendor who puts data into Google Cloud Storage. The company's Snowflake account is set up in Azure.

What would be the MOST efficient way to load data from the vendor into Snowflake?

Options:

A.

Ask the vendor to create a Snowflake account, load the data into Snowflake and create a data share.

B.

Create an external stage on Google Cloud Storage and use the external table to load the data into Snowflake.

C.

Copy the data from Google Cloud Storage to Azure Blob storage using external tools and load data from Blob storage to Snowflake.

D.

Create a Snowflake Account in the Google Cloud Platform (GCP), ingest data into this account and use data replication to move the data from GCP to Azure.

Question 35

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Question 36

What step will improve the performance of queries executed against an external table?

Options:

A.

Partition the external table.

B.

Shorten the names of the source files.

C.

Convert the source files' character encoding to UTF-8.

D.

Use an internal stage instead of an external stage to store the source files.

Question 37

A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow access from the required IP addresses? (Select TWO).

Options:

A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY = 'ANALYST_POLICY';

B.

ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';

C.

ALTER USER ANALYST_USER SET NETWORK_POLICY = '10.1.1.20';

D.

USE ROLE SECURITYADMIN;

CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

E.

USE ROLE USERADMIN;

CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY

ALLOWED_IP_LIST = ('10.1.1.20');

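For reference, this is how such statements combine in practice; the 10.1.1.20 address comes from the options, and whether the policy is attached at the role or user level is what the question tests.

USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
ALTER USER analyst_user SET NETWORK_POLICY = 'ANALYST_POLICY';
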
Question 38

When using the COPY INTO <table> command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?

Options:

A.

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.

The parameter will be ignored.

C.

The command will return an error.

D.

The command will return a warning stating that the file has unmatched columns.

Explanation:

The COPY INTO <table> command is used to load data from staged files into an existing table in Snowflake. The command supports various file formats, such as CSV, JSON, Avro, ORC, Parquet, and XML [1]. The MATCH_BY_COLUMN_NAME copy option enables loading semi-structured data into separate columns in the target table that match corresponding columns represented in the source data [2]. The option only applies to semi-structured data (JSON, Avro, ORC, Parquet, XML) and does not apply to CSV data, which is considered structured data [2]. When the COPY INTO <table> command is used with the CSV file format, the MATCH_BY_COLUMN_NAME parameter is therefore ignored [2].

References:

1: COPY INTO <table> | Snowflake Documentation
2: MATCH_BY_COLUMN_NAME | Snowflake Documentation
Question 39

In a managed access schema, what are characteristics of the roles that can manage object privileges? (Select TWO).

Options:

A.

Users with the SYSADMIN role can grant object privileges in a managed access schema.

B.

Users with the SECURITYADMIN role or higher can grant object privileges in a managed access schema.

C.

Users who are database owners can grant object privileges in a managed access schema.

D.

Users who are schema owners can grant object privileges in a managed access schema.

E.

Users who are object owners can grant object privileges in a managed access schema.

Question 40

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, maintenance of the infrastructure, including platform upgrades and security, and the development effort should be minimal.

Which design will meet these requirements?

Options:

A.

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a Python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

Question 41

When loading data from a stage using COPY INTO, what options can you specify for the ON_ERROR clause?

Options:

A.

CONTINUE

B.

SKIP_FILE

C.

ABORT_STATEMENT

D.

FAIL

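For reference, the ON_ERROR copy option is specified as in this sketch; the table, stage, and file format details are placeholders.

COPY INTO my_table
FROM @my_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'SKIP_FILE';
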
Question 42

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

Options:

A.

Create a clone of the primary database, then replicate the database.

B.

Move the external tables to a database that is not replicated, then replicate the primary database.

C.

Replicate the database, ensuring the replicated database is in the same region as the external tables.

D.

Share the primary database with an account in the same region that the database will be replicated to.

Question 43

You are a Snowflake Architect in an organization. The business team has come to you to deploy a use case that requires loading some data which they can visualize through Tableau. Every day new data comes in and the old data is no longer required.

What type of table will you use in this case to optimize cost?

Options:

A.

TRANSIENT

B.

TEMPORARY

C.

PERMANENT

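For reference, a transient table, which carries no Fail-safe storage, is created as below; the table definition is a placeholder.

CREATE TRANSIENT TABLE daily_report_data (
  store_id   INT,
  sales_date DATE,
  amount     NUMBER(12,2)
)
DATA_RETENTION_TIME_IN_DAYS = 0;  -- optionally disable Time Travel as well
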
Question 44

A company has an inbound share set up with eight tables and five secure views. The company plans to make the share part of its production data pipelines.

Which actions can the company take with the inbound share? (Choose two.)

Options:

A.

Clone a table from a share.

B.

Grant modify permissions on the share.

C.

Create a table from the shared database.

D.

Create additional views inside the shared database.

E.

Create a table stream on the shared table.

Question 45

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Question 46

A company is following the Data Mesh principles, including domain separation, and chose one Snowflake account for its data platform.

An Architect created two data domains to produce two data products. The Architect needs a third data domain that will use both of the data products to create an aggregate data product. The read access to the data products will be granted through a separate role.

Based on the Data Mesh principles, how should the third domain be configured to create the aggregate product if it has been granted the two read roles?

Options:

A.

Use secondary roles for all users.

B.

Create a hierarchy between the two read roles.

C.

Request a technical ETL user with the sysadmin role.

D.

Request that the two data domains share data using the Data Exchange.

Question 47

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

Options:

A.

All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

B.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

C.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

D.

All rows loaded using a specific COPY statement will have the same timestamp value.

Question 48

Files arrive in an external stage every 10 seconds from a proprietary system. The files range in size from 500 KB to 3 MB. The data must be accessible by dashboards as soon as it arrives.

How can a Snowflake Architect meet this requirement with the LEAST amount of coding? (Choose two.)

Options:

A.

Use Snowpipe with auto-ingest.

B.

Use a COPY command with a task.

C.

Use a materialized view on an external table.

D.

Use the COPY INTO command.

E.

Use a combination of a task and a stream.

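For reference, an auto-ingest pipe over an external stage is declared as in this sketch; the pipe, stage, and table names are placeholders, and cloud event notifications must also be configured on the storage location.

CREATE PIPE dashboards_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO dashboard_events
FROM @ext_stage
FILE_FORMAT = (TYPE = 'JSON');
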