A company is following the Data Mesh principles, including domain separation, and chose one Snowflake account for its data platform.
An Architect created two data domains to produce two data products. The Architect now needs a third data domain that will consume both data products to create an aggregate data product. Read access to the two data products will be granted through a separate role.
Based on the Data Mesh principles, how should the third domain be configured to create the aggregate product if it has been granted the two read roles?
A. Request a technical ETL user with the sysadmin role.
B. Create a hierarchy between the two read roles.
C. Use secondary roles for all users.
D. Request that the two data domains share data using the Data Exchange.
Correct Answer: D
Explanation: (Visible only to Pass4Test members)
Question 2:
A user can change object parameters using which of the following roles?
A. SYSADMIN, SECURITYADMIN
B. ACCOUNTADMIN, SECURITYADMIN
C. SECURITYADMIN, USER with PRIVILEGE
D. ACCOUNTADMIN, USER with PRIVILEGE
Correct Answer: D
Explanation: (Visible only to Pass4Test members)
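For illustration, object and account parameters are changed with ALTER statements; a minimal sketch of the two roles in answer D (the table name and parameter values are hypothetical):

```sql
-- ACCOUNTADMIN can set account-level parameters
ALTER ACCOUNT SET STATEMENT_TIMEOUT_IN_SECONDS = 3600;

-- A user holding the relevant privilege on the object (e.g. ownership)
-- can set that object's parameters
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 7;
```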
Question 3:
An Architect needs to design a Snowflake account and database strategy to store and analyze large amounts of structured and semi-structured data. There are many business units and departments within the company. The requirements are scalability, security, and cost efficiency.
What design should be used?
A. Create a single Snowflake account and database for all data storage and analysis needs, regardless of data volume or complexity.
B. Use a centralized Snowflake database for core business data, and use separate databases for departmental or project-specific data.
C. Use Snowflake's data lake functionality to store and analyze all data in a central location, without the need for structured schemas or indexes
D. Set up separate Snowflake accounts and databases for each department or business unit, to ensure data isolation and security.
Correct Answer: B
Explanation: (Visible only to Pass4Test members)
Question 4:
An Architect uses COPY INTO with the ON_ERROR=SKIP_FILE option to bulk load CSV files into a table called TABLEA, using its table stage. One file named file5.csv fails to load. The Architect fixes the file and re-loads it to the stage with the exact same file name it had previously.
Which commands should the Architect use to load only file5.csv file from the stage? (Choose two.)
A. COPY INTO tablea FROM @%tablea NEW_FILES_ONLY = TRUE;
B. COPY INTO tablea FROM @%tablea MERGE = TRUE;
C. COPY INTO tablea FROM @%tablea RETURN_FAILED_ONLY = TRUE;
D. COPY INTO tablea FROM @%tablea FORCE = TRUE;
E. COPY INTO tablea FROM @%tablea;
F. COPY INTO tablea FROM @%tablea FILES = ('file5.csv');
Correct Answer: E, F
Explanation: (Visible only to Pass4Test members)
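As a sketch, the two correct commands from answers E and F look like this (table and file names are taken from the question):

```sql
-- Option F: explicitly name the fixed file in the table stage
COPY INTO tablea FROM @%tablea FILES = ('file5.csv');

-- Option E: a plain COPY also works here, because Snowflake's load
-- metadata detects that the re-uploaded file5.csv has changed, and
-- skips only files that were already loaded successfully
COPY INTO tablea FROM @%tablea;
```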
Question 5:
A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.
The company's Snowflake account is in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region.
How should the company's Architect configure the data share?
A. 1. Create a share.
2. Add objects to the share.
3. Add a consumer account to the share for the vendor to access.
B. 1. Create a share.
2. Create a reader account for the vendor to use.
3. Add the reader account to the share.
C. 1. Create a new role called db_share.
2. Grant the db_share role privileges to read data from the company database and schema.
3. Create a user for the vendor.
4. Grant the db_share role to the vendor's users.
D. 1. Promote an existing database in the company's local account to primary.
2. Replicate the database to Snowflake on Azure in the West-Europe region.
3. Create a share and add objects to the share.
4. Add a consumer account to the share for the vendor to access.
Correct Answer: A
Explanation: (Visible only to Pass4Test members)
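The three steps in answer A map to Snowflake DDL roughly as follows (the share, database, table, and account names are hypothetical):

```sql
-- 1. Create a share
CREATE SHARE sales_share;

-- 2. Add objects to the share
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.inventory TO SHARE sales_share;

-- 3. Add the vendor's consumer account to the share
ALTER SHARE sales_share ADD ACCOUNTS = vendor_org.vendor_account;
```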
Question 6:
A company has several sites in different regions from which the company wants to ingest data.
Which of the following will enable this type of data ingestion?
A. The company must have a Snowflake account in each cloud region to be able to ingest data to that account.
B. The company should use a storage integration for the external stage.
C. The company should provision a reader account to each site and ingest the data through the reader accounts.
D. The company must replicate data between Snowflake accounts.
Correct Answer: B
Explanation: (Visible only to Pass4Test members)
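As a sketch of answer B, a storage integration plus an external stage over a regional bucket might look like this (the integration name, role ARN, and bucket path are hypothetical):

```sql
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://region-site-bucket/data/');

-- External stage pointing at the site's bucket, using the integration
CREATE STAGE site_stage
  URL = 's3://region-site-bucket/data/'
  STORAGE_INTEGRATION = s3_int;
```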
Question 7:
What is a characteristic of loading data into Snowflake using the Snowflake Connector for Kafka?
A. The Connector works with all file formats, including text, JSON, Avro, ORC, Parquet, and XML.
B. The Connector only works in Snowflake regions that use AWS infrastructure.
C. The Connector creates and manages its own stage, file format, and pipe objects.
D. Loads using the Connector will have lower latency than Snowpipe and will ingest data in real time.
Correct Answer: C
Explanation: (Visible only to Pass4Test members)
Question 8:
A user, analyst_user has been granted the analyst_role, and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.
What steps should be taken to allow access from the required IP addresses? (Select TWO).
A. ALTERUSERANALYST_USERSETNETWORK_POLICY='10.1.1.20';
B. ALTERROLEANALYST_ROLESETNETWORK_POLICY='ANALYST_POLICY';
C. ALTERUSERANALYSTJJSERSETNETWORK_POLICY='ANALYST_POLICY';
D. USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST =
('10.1.1.20');
E. USE ROLE USERADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY
ALLOWED_IP_LIST = ('10.1.1.20');
Correct Answer: C, D
Explanation: (Visible only to Pass4Test members)
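Combining the two correct answers (D, then C), the full sequence would be roughly:

```sql
-- D: SECURITYADMIN (which holds CREATE NETWORK POLICY by default)
-- creates the policy with the service's allowed IP
USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY
  ALLOWED_IP_LIST = ('10.1.1.20');

-- C: attach the policy to the service user
ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';
```

Network policies are attached to a user (or the account), not to a role, which is why option B is incorrect.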