
ARA-R01 Sample Questions and Answers

Questions 4

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage leveraging event notifications. Also, the operational complexity, maintenance of the infrastructure, including platform upgrades and security, and the development effort should be minimal.

Which design will meet these requirements?

Options:

A.

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a Python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

Questions 5

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Option E is not relevant because it does not use Snowpipe, but rather the standard COPY command, which is a batch loading method. Moreover, the COPY command does not support ingesting files from Amazon S3 Glacier storage [7].

References:
  • 1: SnowPro Advanced: Architect | Study Guide
  • 2: Snowflake Documentation | Snowpipe Overview
  • 3: Snowflake Documentation | Using the Snowpipe REST API
  • 4: Snowflake Documentation | Loading Data Using Snowpipe and AWS Lambda
  • 5: Snowflake Documentation | Supported File Formats and Compression for Staged Data Files
  • 6: Snowflake Documentation | Using Cloud Notifications to Trigger Snowpipe
  • 7: Snowflake Documentation | Loading Data Using COPY into a Table
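
For illustration only, a minimal sketch of a pipe that could be triggered through the Snowpipe REST endpoint (for example, from a client application or an AWS Lambda function). All object names are hypothetical and a CSV file format is assumed:

-- Pipe without AUTO_INGEST; loads are triggered by calling the
-- Snowpipe REST API (insertFiles) with a list of staged file names.
create or replace pipe reviews_db.public.reviews_pipe as
  copy into reviews_db.public.reviews_raw
  from @reviews_db.public.s3_reviews_stage
  file_format = (type = 'CSV');
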
Questions 6

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

    How can these requirements be met?

    Options:

    A.

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.

Use PURGE = TRUE in the COPY INTO command.

C.

Use PURGE = FALSE in the COPY INTO command.

D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.
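
For reference, a minimal sketch of a COPY INTO statement that sets the PURGE copy option so that successfully loaded files are removed from the stage. Table, stage, and file format details are hypothetical:

copy into sales_raw
  from @migration_stage
  file_format = (type = 'CSV')
  on_error = 'ABORT_STATEMENT'
  purge = true;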

    Questions 7

    What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?

    Options:

    A.

    Every Kafka message is in JSON or Avro format.

    B.

    The default retention time for Kafka topics is 14 days.

    C.

The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).

    D.

    The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe it will result in an exception.

    Questions 8

    Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

A), B), C), D): [The four candidate queries were provided as images and are not reproduced here.]

    Options:

    A.

    Option A

    B.

    Option B

    C.

    Option C

    D.

    Option D

    Questions 9

    The following DDL command was used to create a task based on a stream:

Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

    Options:

    A.

    The warehouse MY_WH will be made active every five minutes to check the stream.

    B.

    The warehouse MY_WH will only be active when there are results in the stream.

    C.

    The warehouse MY_WH will never suspend.

    D.

    The warehouse MY_WH will automatically resize to accommodate the size of the stream.
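
The DDL referenced in the question is not reproduced here. As an assumption-laden sketch of what a task based on a stream typically looks like (all names other than MY_WH are hypothetical, and a 5-minute schedule is assumed):

create or replace task process_orders_task
  warehouse = MY_WH
  schedule = '5 MINUTE'
when
  system$stream_has_data('ORDERS_STREAM')
as
  insert into orders_final
  select * from orders_stream;

With this shape, the WHEN condition is evaluated without resuming the warehouse; the warehouse is only resumed when the condition is true and the task body actually runs.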

    Questions 10

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file has Parquet-formatted data; the other has CSV-formatted data.

    How should the data be joined and aggregated to produce a final result set?

    Options:

    A.

    Use Snowpipe to ingest the two files, then create a materialized view to produce the final result set.

    B.

    Create a task using Snowflake scripting that will import the files, and then call a User-Defined Function (UDF) to produce the final result set.

    C.

Create a JavaScript stored procedure to read, join, and aggregate the data directly from the external stage, and then store the results in a table.

    D.

Create a materialized view to read, join, and aggregate the data directly from the external stage, and use the view to produce the final result set.

    Questions 11

    A company has a Snowflake account named ACCOUNTA in AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company’s business partners has an account named PARTNERB in Azure East US 2 region. For marketing purposes the company has agreed to share the database MARKET_DB with the partner account.

    Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

    Options:

    A.

    Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.

    B.

    From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.

    C.

    Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.

    D.

    Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partner’s account PARTNERB.

    Questions 12

    An Architect on a new project has been asked to design an architecture that meets Snowflake security, compliance, and governance requirements as follows:

    1) Use Tri-Secret Secure in Snowflake

    2) Share some information stored in a view with another Snowflake customer

    3) Hide portions of sensitive information from some columns

    4) Use zero-copy cloning to refresh the non-production environment from the production environment

    To meet these requirements, which design elements must be implemented? (Choose three.)

    Options:

    A.

    Define row access policies.

    B.

    Use the Business-Critical edition of Snowflake.

    C.

    Create a secure view.

    D.

    Use the Enterprise edition of Snowflake.

    E.

    Use Dynamic Data Masking.

    F.

    Create a materialized view.

    Questions 13

    What is the MOST efficient way to design an environment where data retention is not considered critical, and customization needs are to be kept to a minimum?

    Options:

    A.

    Use a transient database.

    B.

    Use a transient schema.

    C.

    Use a transient table.

    D.

    Use a temporary table.
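
As a small illustrative sketch (hypothetical names), a transient table skips Fail-safe and allows a minimal Time Travel retention:

create transient table staging_db.public.web_events (
  event_id number,
  event_payload variant
)
data_retention_time_in_days = 0;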

    Questions 14

    A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:

    * Confirmed Private Link URLs are working by logging in with a username/password account

    * Verified DNS resolution by running nslookups against Private Link URLs

    * Validated connectivity using SnowCD

    * Disabled public access using a network policy set to use the company’s IP address range

    However, the following error message is received when using SSO to log into the company account:

    IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.

    What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)

    Options:

    A.

    Alter the Azure security integration to use the Private Link URLs.

    B.

    Add the IP address in the error message to the allowed list in the network policy.

    C.

    Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.

    D.

    Update the configuration of the Azure AD SSO to use the Private Link URLs.

    E.

    Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.

    Questions 15

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

    Options:

    A.

    All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

    B.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

    C.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

    D.

    All rows loaded using a specific COPY statement will have the same timestamp value.
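
A minimal sketch of a table that captures the load time through a column default (names are hypothetical):

create or replace table load_audit_demo (
  id number,
  loaded_at timestamp_ltz default current_timestamp()
);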

    Questions 16

A user is executing the following commands sequentially within a timeframe of 10 minutes from start to finish:

    What would be the output of this query?

    Options:

    A.

    Table T_SALES_CLONE successfully created.

    B.

    Time Travel data is not available for table T_SALES.

    C.

The offset => is not a valid clause in the clone operation.

    D.

Syntax error line 1 at position 58 unexpected 'at'.
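
The command sequence the question refers to is not reproduced here. As a hypothetical sketch of a clone statement that uses a Time Travel offset (table names follow the options above; the offset value is assumed):

create table t_sales_clone clone t_sales
  at (offset => -60 * 5);  -- state of t_sales as of 5 minutes ago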

    Questions 17

    Company A has recently acquired company B. The Snowflake deployment for company B is located in the Azure West Europe region.

    As part of the integration process, an Architect has been asked to consolidate company B's sales data into company A's Snowflake account which is located in the AWS us-east-1 region.

    How can this requirement be met?

    Options:

    A.

    Replicate the sales data from company B's Snowflake account into company A's Snowflake account using cross-region data replication within Snowflake. Configure a direct share from company B's account to company A's account.

    B.

    Export the sales data from company B's Snowflake account as CSV files, and transfer the files to company A's Snowflake account. Import the data using Snowflake's data loading capabilities.

    C.

    Migrate company B's Snowflake deployment to the same region as company A's Snowflake deployment, ensuring data locality. Then perform a direct database-to-database merge of the sales data.

    D.

    Build a custom data pipeline using Azure Data Factory or a similar tool to extract the sales data from company B's Snowflake account. Transform the data, then load it into company A's Snowflake account.

    Questions 18

Files arrive in an external stage every 10 seconds from a proprietary system. The files range in size from 500 KB to 3 MB. The data must be accessible by dashboards as soon as it arrives.

    How can a Snowflake Architect meet this requirement with the LEAST amount of coding? (Choose two.)

    Options:

    A.

    Use Snowpipe with auto-ingest.

    B.

    Use a COPY command with a task.

    C.

    Use a materialized view on an external table.

    D.

    Use the COPY INTO command.

    E.

    Use a combination of a task and a stream.

    Questions 19

    How is the change of local time due to daylight savings time handled in Snowflake tasks? (Choose two.)

    Options:

    A.

    A task scheduled in a UTC-based schedule will have no issues with the time changes.

    B.

    Task schedules can be designed to follow specified or local time zones to accommodate the time changes.

    C.

    A task will move to a suspended state during the daylight savings time change.

    D.

    A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.

    E.

    A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.

    Questions 20

    At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY privileges be granted?

    Options:

    A.

    Global

    B.

    Database

    C.

    Schema

    D.

    Table
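
For reference, a sketch of how these privileges are typically granted, at the account level, with a hypothetical role name:

grant apply masking policy on account to role data_governance;
grant apply row access policy on account to role data_governance;
grant apply session policy on account to role data_governance;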

    Questions 21

    An Architect for a multi-national transportation company has a system that is used to check the weather conditions along vehicle routes. The data is provided to drivers.

The weather information is delivered regularly by a third-party company and this information is generated as a JSON structure. The data is then loaded into Snowflake in a column with a VARIANT data type. This table is directly queried to deliver the statistics to the drivers with minimum time lapse.

    A single entry includes (but is not limited to):

- Weather condition: cloudy, sunny, rainy, etc.

    - Degree

    - Longitude and latitude

    - Timeframe

    - Location address

    - Wind

    The table holds more than 10 years' worth of data in order to deliver the statistics from different years and locations. The amount of data on the table increases every day.

    The drivers report that they are not receiving the weather statistics for their locations in time.

    What can the Architect do to deliver the statistics to the drivers faster?

    Options:

    A.

    Create an additional table in the schema for longitude and latitude. Determine a regular task to fill this information by extracting it from the JSON dataset.

    B.

    Add search optimization service on the variant column for longitude and latitude in order to query the information by using specific metadata.

    C.

    Divide the table into several tables for each year by using the timeframe information from the JSON dataset in order to process the queries in parallel.

    D.

    Divide the table into several tables for each location by using the location address information from the JSON dataset in order to process the queries in parallel.

    Questions 22

    An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.

    Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?

    Options:

    A.

Utilize a higher buffer.flush.time in the connector configuration.

B.

Utilize a higher buffer.size.bytes in the connector configuration.

C.

Utilize a lower buffer.size.bytes in the connector configuration.

D.

Utilize a lower buffer.count.records in the connector configuration.

    Questions 23

    An Architect entered the following commands in sequence:

    USER1 cannot find the table.

    Which of the following commands does the Architect need to run for USER1 to find the tables using the Principle of Least Privilege? (Choose two.)

    Options:

    A.

    GRANT ROLE PUBLIC TO ROLE INTERN;

    B.

    GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;

    C.

    GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;

    D.

    GRANT OWNERSHIP ON DATABASE SANDBOX TO USER INTERN;

    E.

    GRANT ALL PRIVILEGES ON DATABASE SANDBOX TO ROLE INTERN;

    Questions 24

    Which data models can be used when modeling tables in a Snowflake environment? (Select THREE).

    Options:

    A.

    Graph model

    B.

    Dimensional/Kimball

    C.

    Data lake

    D.

Inmon/3NF

    E.

    Bayesian hierarchical model

    F.

    Data vault

    Questions 25

    What transformations are supported in the below SQL statement? (Select THREE).

    CREATE PIPE ... AS COPY ... FROM (...)

    Options:

    A.

    Data can be filtered by an optional where clause.

    B.

    Columns can be reordered.

    C.

    Columns can be omitted.

    D.

    Type casts are supported.

    E.

    Incoming data can be joined with other tables.

    F.

The ON_ERROR = ABORT_STATEMENT option can be used.

The CREATE PIPE ... AS COPY ... FROM (...) statement is used by Snowpipe to load data from an ingestion queue into tables [1]. The statement uses a subquery in the FROM clause to transform the data from the staged files before loading it into the table [2].

The transformations supported in the subquery are as follows [2]:

create pipe mypipe as
copy into mytable
from (
  select * from @mystage
  where col1 = 'A' and col2 > 10
);

create pipe mypipe as
copy into mytable (col1, col2, col3)
from (
  select col3, col1, col2 from @mystage
);

create pipe mypipe as
copy into mytable (col1, col2)
from (
  select col1, col2 from @mystage
);

The other options are not supported in the subquery [2]:

create pipe mypipe as
copy into mytable (col1, col2)
from (
  select col1::date, col2 from @mystage
);

create pipe mypipe as
copy into mytable (col1, col2, col3)
from (
  select s.col1, s.col2, t.col3 from @mystage s
  join othertable t on s.col1 = t.col1
);

create pipe mypipe as
copy into mytable
from (
  select * from @mystage
  on error abort
);

References:

• 1: CREATE PIPE | Snowflake Documentation
• 2: Transforming Data During a Load | Snowflake Documentation

    Questions 26

    An Architect has chosen to separate their Snowflake Production and QA environments using two separate Snowflake accounts.

    The QA account is intended to run and test changes on data and database objects before pushing those changes to the Production account. It is a requirement that all database objects and data in the QA account need to be an exact copy of the database objects, including privileges and data in the Production account on at least a nightly basis.

    Which is the LEAST complex approach to use to populate the QA account with the Production account’s data and database objects on a nightly basis?

    Options:

    A.

    1) Create a share in the Production account for each database

    2) Share access to the QA account as a Consumer

    3) The QA account creates a database directly from each share

    4) Create clones of those databases on a nightly basis

    5) Run tests directly on those cloned databases

    B.

    1) Create a stage in the Production account

    2) Create a stage in the QA account that points to the same external object-storage location

    3) Create a task that runs nightly to unload each table in the Production account into the stage

    4) Use Snowpipe to populate the QA account

    C.

    1) Enable replication for each database in the Production account

    2) Create replica databases in the QA account

    3) Create clones of the replica databases on a nightly basis

    4) Run tests directly on those cloned databases

    D.

    1) In the Production account, create an external function that connects into the QA account and returns all the data for one specific table

    2) Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account

    Questions 27

    A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors.

    The company’s Snowflake Architects need to design a solution that would allow the business divisions to decide what to share, while minimizing the level of effort spent on configuration and management. Most of the company divisions use Snowflake accounts in the same cloud deployments with a few exceptions for European-based divisions.

    According to Snowflake recommended best practice, how should these requirements be met?

    Options:

    A.

    Migrate the European accounts in the global region and manage shares in a connected graph architecture. Deploy a Data Exchange.

    B.

    Deploy a Private Data Exchange in combination with data shares for the European accounts.

    C.

    Deploy to the Snowflake Marketplace making sure that invoker_share() is used in all secure views.

    D.

    Deploy a Private Data Exchange and use replication to allow European data shares in the Exchange.

    Questions 28

    How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Select TWO).

    Options:

    A.

    Set masking policy conditions using current_role targeting the role in use for the current session.

    B.

    Set masking policy conditions using is_role_in_session targeting the role in use for the current account.

    C.

    Set masking policy conditions using invoker_role targeting the executing role in a SQL statement.

    D.

    Determine if there are ownership privileges on the masking policy that would allow the use of any function.

    E.

    Assign the accountadmin role to the user who is executing the object.
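
As an illustrative sketch (policy, role, table, and column names are hypothetical), a masking policy can use the CURRENT_ROLE and INVOKER_ROLE context functions in its conditions:

create or replace masking policy pii_email_mask as (val string)
returns string ->
  case
    when current_role() in ('HR_ANALYST') then val
    when invoker_role() in ('HR_REPORTING') then val
    else '*** MASKED ***'
  end;

alter table employees modify column email
  set masking policy pii_email_mask;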

    Questions 29

    When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)

    Options:

    A.

    CSV

    B.

    XML

    C.

    Avro

    D.

    JSON

    E.

    Parquet

    Questions 30

    What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

    Options:

    A.

    The MERGE command

    B.

    The UPSERT command

    C.

    The CHANGES clause

    D.

    A STREAM object

    E.

    The CHANGE_DATA_CAPTURE command
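
A minimal sketch (hypothetical table and stream names) showing two consumers of a table's change tracking metadata, a stream object and the CHANGES clause:

alter table orders set change_tracking = true;

create or replace stream orders_stream on table orders;

select *
from orders
  changes (information => default)
  at (offset => -60 * 10);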

    Questions 31

A Developer is having a performance issue with a Snowflake query. The query receives up to 10 different values for one parameter and then performs an aggregation over the majority of a fact table. It then joins against a smaller dimension table. This parameter value is selected by the different query users when they execute it during business hours. Both the fact and dimension tables are loaded with new data in an overnight import process.

On a Small or Medium-sized virtual warehouse, the query performs slowly. Performance is acceptable on a size Large or bigger warehouse. However, there is no budget to increase costs. The Developer needs a recommendation that does not increase compute costs to run this query.

    What should the Architect recommend?

    Options:

    A.

    Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The query results will then be cached and ready to respond quickly when the users re-issue the query.

    B.

    Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The task will be scheduled to align with the users' working hours in order to allow the warehouse cache to be used.

    C.

    Enable the search optimization service on the table. When the users execute the query, the search optimization service will automatically adjust the query execution plan based on the frequently-used parameters.

    D.

    Create a dedicated size Large warehouse for this particular set of queries. Create a new role that has USAGE permission on this warehouse and has the appropriate read permissions over the fact and dimension tables. Have users switch to this role and use this warehouse when they want to access this data.

    Questions 32

    What Snowflake features should be leveraged when modeling using Data Vault?

    Options:

    A.

    Snowflake’s support of multi-table inserts into the data model’s Data Vault tables

    B.

    Data needs to be pre-partitioned to obtain a superior data access performance

    C.

    Scaling up the virtual warehouses will support parallel processing of new source loads

    D.

    Snowflake’s ability to hash keys so that hash key joins can run faster than integer joins
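
As a hedged sketch of the multi-table insert pattern commonly used for Data Vault loads (hub, satellite, and staging names are hypothetical):

insert all
  into hub_customer (customer_hk, customer_id, load_dts, record_source)
    values (customer_hk, customer_id, load_dts, record_source)
  into sat_customer (customer_hk, customer_name, hash_diff, load_dts)
    values (customer_hk, customer_name, hash_diff, load_dts)
select
  md5(customer_id) as customer_hk,
  customer_id,
  customer_name,
  md5(customer_name) as hash_diff,
  current_timestamp() as load_dts,
  'CRM_EXPORT' as record_source
from stg_customers;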

    Questions 33

    A company needs to have the following features available in its Snowflake account:

    1. Support for Multi-Factor Authentication (MFA)

    2. A minimum of 2 months of Time Travel availability

    3. Database replication in between different regions

    4. Native support for JDBC and ODBC

    5. Customer-managed encryption keys using Tri-Secret Secure

    6. Support for Payment Card Industry Data Security Standards (PCI DSS)

    In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?

    Options:

    A.

    Standard

    B.

    Enterprise

    C.

    Business Critical

    D.

    Virtual Private Snowflake (VPS)

    Questions 34

    The Data Engineering team at a large manufacturing company needs to engineer data coming from many sources to support a wide variety of use cases and data consumer requirements which include:

    1) Finance and Vendor Management team members who require reporting and visualization

    2) Data Science team members who require access to raw data for ML model development

    3) Sales team members who require engineered and protected data for data monetization

    What Snowflake data modeling approaches will meet these requirements? (Choose two.)

    Options:

    A.

    Consolidate data in the company’s data lake and use EXTERNAL TABLES.

    B.

    Create a raw database for landing and persisting raw data entering the data pipelines.

    C.

    Create a set of profile-specific databases that aligns data with usage patterns.

    D.

    Create a single star schema in a single database to support all consumers’ requirements.

    E.

    Create a Data Vault as the sole data pipeline endpoint and have all consumers directly access the Vault.

    Questions 35

    Database DB1 has schema S1 which has one table, T1.

    DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

    The user runs the following command:

    Drop Database DB1;

    What will the Time Travel retention period be for T1?

    Options:

    A.

    10 days

    B.

    20 days

    C.

    30 days

    D.

    37 days
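
For reference, a sketch of how the retention periods described in the question would be set at each level (statement order is illustrative):

alter database db1 set data_retention_time_in_days = 10;
alter schema db1.s1 set data_retention_time_in_days = 20;
alter table db1.s1.t1 set data_retention_time_in_days = 30;

drop database db1;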

    Questions 36

    A user has activated primary and secondary roles for a session.

    What operation is the user prohibited from using as part of SQL actions in Snowflake using the secondary role?

    Options:

    A.

    Insert

    B.

    Create

    C.

    Delete

    D.

    Truncate

    Questions 37

Which system functions does Snowflake provide to monitor clustering information within a table? (Choose two.)

    Options:

    A.

    SYSTEM$CLUSTERING_INFORMATION

    B.

    SYSTEM$CLUSTERING_USAGE

    C.

    SYSTEM$CLUSTERING_DEPTH

    D.

    SYSTEM$CLUSTERING_KEYS

    E.

    SYSTEM$CLUSTERING_PERCENT
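
For reference, the clustering system functions take a table name and an optional column list; a small sketch with hypothetical names:

select system$clustering_information('sales_fact', '(order_date)');
select system$clustering_depth('sales_fact', '(order_date)');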

    Questions 38

    Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

    Options:

    A.

    IDEF1X

    B.

    Schema-on-write

    C.

    Schema-on-read

    D.

    Information schema

    Questions 39

    A company wants to deploy its Snowflake accounts inside its corporate network with no visibility on the internet. The company is using a VPN infrastructure and Virtual Desktop Infrastructure (VDI) for its Snowflake users. The company also wants to re-use the login credentials set up for the VDI to eliminate redundancy when managing logins.

    What Snowflake functionality should be used to meet these requirements? (Choose two.)

    Options:

    A.

    Set up replication to allow users to connect from outside the company VPN.

    B.

    Provision a unique company Tri-Secret Secure key.

    C.

    Use private connectivity from a cloud provider.

    D.

    Set up SSO for federated authentication.

    E.

    Use a proxy Snowflake account outside the VPN, enabling client redirect for user logins.

    Questions 40

    How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

    Options:

    A.

    Shared databases are read-only.

    B.

    Shared databases must be refreshed in order for new data to be visible.

    C.

    Shared databases cannot be cloned.

    D.

    Shared databases are not supported by Time Travel.

    E.

    Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

    F.

    Shared databases can also be created as transient databases.

    Questions 41

    Which Snowflake architecture recommendation needs multiple Snowflake accounts for implementation?

    Options:

    A.

    Enable a disaster recovery strategy across multiple cloud providers.

    B.

    Create external stages pointing to cloud providers and regions other than the region hosting the Snowflake account.

    C.

    Enable zero-copy cloning among the development, test, and production environments.

    D.

    Enable separation of the development, test, and production environments.

    Questions 42

A company wants to integrate its main enterprise identity provider with federated authentication with Snowflake.

The authentication integration has been configured and roles have been created in Snowflake. However, the users are not automatically appearing in Snowflake when created, and their group membership is not reflected in their assigned roles.

    How can the missing functionality be enabled with the LEAST amount of operational overhead?

    Options:

    A.

    OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users and roles.

    B.

    OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users, and the resource server must be configured with the right mapping of role assignment.

    C.

    SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, their groups will get created as group accounts in Snowflake and the proper roles can be granted.

    D.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, users will automatically get created and their group membership will be reflected as roles in Snowflake.

    Questions 43

    An Architect needs to design a data unloading strategy for Snowflake, that will be used with the COPY INTO command.

    Which configuration is valid?

    Options:

    A.

    Location of files: Snowflake internal location

    . File formats: CSV, XML

    . File encoding: UTF-8

    . Encryption: 128-bit

    B.

    Location of files: Amazon S3

    . File formats: CSV, JSON

    . File encoding: Latin-1 (ISO-8859)

    . Encryption: 128-bit

    C.

    Location of files: Google Cloud Storage

    . File formats: Parquet

    . File encoding: UTF-8

. Compression: gzip

    D.

    Location of files: Azure ADLS

    . File formats: JSON, XML, Avro, Parquet, ORC

    . Compression: bzip2

    . Encryption: User-supplied key

    Questions 44

    Role A has the following permissions:

    . USAGE on db1

. USAGE and CREATE VIEW on schema1 in db1

. SELECT on table1 in schema1

    Role B has the following permissions:

    . USAGE on db2

    . USAGE and CREATE VIEW on schema2 in db2

    . SELECT on table2 in schema2

    A user has Role A set as the primary role and Role B as a secondary role.

    What command will fail for this user?

    Options:

    A.

    use database db1;

use schema schema1;

    create view v1 as select * from db2.schema2.table2;

    B.

    use database db2;

    use schema schema2;

create view v2 as select * from db1.schema1.table1;

    C.

    use database db2;

    use schema schema2;

select * from db1.schema1.table1 union select * from table2;

    D.

    use database db1;

use schema schema1;

    select * from db2.schema2.table2;

    Questions 45

    A retailer's enterprise data organization is exploring the use of Data Vault 2.0 to model its data lake solution. A Snowflake Architect has been asked to provide recommendations for using Data Vault 2.0 on Snowflake.

    What should the Architect tell the data organization? (Select TWO).

    Options:

    A.

    Change data capture can be performed using the Data Vault 2.0 HASH_DIFF concept.

    B.

    Change data capture can be performed using the Data Vault 2.0 HASH_DELTA concept.

    C.

    Using the multi-table insert feature in Snowflake, multiple Point-in-Time (PIT) tables can be loaded in parallel from a single join query from the data vault.

    D.

    Using the multi-table insert feature, multiple Point-in-Time (PIT) tables can be loaded sequentially from a single join query from the data vault.

    E.

    There are performance challenges when using Snowflake to load multiple Point-in-Time (PIT) tables in parallel from a single join query from the data vault.

    Questions 46

    How does a standard virtual warehouse policy work in Snowflake?

    Options:

    A.

    It conserves credits by keeping running clusters fully loaded rather than starting additional clusters.

    B.

    It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 6 minutes.

    C.

It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 2 minutes.

    D.

    It prevents or minimizes queuing by starting additional clusters instead of conserving credits.

    Questions 47

    A user, analyst_user has been granted the analyst_role, and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

    What steps should be taken to allow the IP addresses to be accessed? (Select TWO).

    Options:

    A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY = 'ANALYST_POLICY';

    B.

ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';

    C.

ALTER USER ANALYST_USER SET NETWORK_POLICY = '10.1.1.20';

    D.

    USE ROLE SECURITYADMIN;

    CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

    E.

    USE ROLE USERADMIN;

    CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY

    ALLOWED_IP_LIST = ('10.1.1.20');

    Questions 48

    What integration object should be used to place restrictions on where data may be exported?

    Options:

    A.

    Stage integration

    B.

    Security integration

    C.

    Storage integration

    D.

    API integration

    Exam Code: ARA-R01
    Exam Name: SnowPro Advanced: Architect Recertification Exam
    Last Update: Nov 15, 2024
    Questions: 162