Snowflake COF-C02 Real Exam Questions
The questions for COF-C02 were last updated on Nov 20, 2024.
- Exam Code: COF-C02
- Exam Name: SnowPro Core Certification Exam
- Certification Provider: Snowflake
- Latest update: Nov 20, 2024
What is the default character set used when loading CSV files into Snowflake?
- A . UTF-8
- B . UTF-16
- C . ISO 8859-1
- D . ANSI_X3.4
A
Explanation:
https://docs.snowflake.com/en/user-guide/intro-summary-loading.html#:~:text=For%20delimited%20files%20(CSV%2C%20TSV,encoding%20to%20use%20for%20loading.
For delimited files (CSV, TSV, etc.), the default character set is UTF-8. To use any other character set, you must explicitly specify the encoding to use for loading. For the list of supported character sets, see Supported Character Sets for Delimited Files in the Snowflake documentation.
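For example, a file format that overrides the default encoding might be defined as follows (the format, table, and stage names here are illustrative, not from the exam):
create or replace file format my_latin1_csv
type = 'CSV'
encoding = 'ISO-8859-1';
copy into my_table
from @my_stage
file_format = (format_name = 'my_latin1_csv');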
Which statement about billing applies to Snowflake credits?
- A . Credits are billed per-minute with a 60-minute minimum
- B . Credits are used to pay for cloud data storage usage
- C . Credits are consumed based on the number of credits billed for each hour that a warehouse runs
- D . Credits are consumed based on the warehouse size and the time the warehouse is running
D
Explanation:
Snowflake credits are used to pay for the consumption of resources on Snowflake. A Snowflake credit is a unit of measure, and it is consumed only when a customer is using resources, such as when a virtual warehouse is running, the cloud services layer is performing work, or serverless features are used.
https://docs.snowflake.com/en/user-guide/what-are-credits.html
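As an illustration, the warehouse definition below (the name is hypothetical) sets both factors that drive credit consumption: the size and how long the warehouse keeps running before suspending:
create warehouse my_wh
warehouse_size = 'MEDIUM'
auto_suspend = 60 -- suspend after 60 seconds of inactivity
auto_resume = true;
A MEDIUM warehouse consumes 4 credits per hour while running, billed per second with a 60-second minimum each time it starts or resumes.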
What is the minimum Snowflake edition required to create a materialized view?
- A . Standard Edition
- B . Enterprise Edition
- C . Business Critical Edition
- D . Virtual Private Snowflake Edition
B
Explanation:
Materialized views require Enterprise Edition. To inquire about upgrading, please contact Snowflake Support.
https://docs.snowflake.com/en/sql-reference/sql/create-materialized-view.html#:~:text=Materialized%20views%20require%20Enterprise%20Edition,upgrading%2C%20please%20contact%20Snowflake%20Support.
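On Enterprise Edition or higher, creating one is a single statement; the view and table names below are illustrative:
create materialized view mv_daily_totals as
select order_date, sum(amount) as total_amount
from orders
group by order_date;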
A user is loading JSON documents composed of a huge array containing multiple records into Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.
What does the STRIP_OUTER_ARRAY file format do?
- A . It removes the last element of the outer array.
- B . It removes the outer array structure and loads the records into separate table rows.
- C . It removes the trailing spaces in the last element of the outer array and loads the records into separate table columns.
- D . It removes the NULL elements from the JSON object, eliminating invalid data, and enables the ability to load the records.
B
Explanation:
Data Size Limitations
The VARIANT data type imposes a 16 MB size limit on individual rows.
For some semi-structured data formats (e.g. JSON), data sets are frequently a simple concatenation of multiple documents. The JSON output from some software is composed of a single huge array containing multiple records. There is no need to separate the documents with line breaks or commas, though both are supported.
If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY INTO <table> command to remove the outer array structure and load the records into separate table rows:
copy into <table>
from @~/<file>.json
file_format = (type = 'JSON' strip_outer_array = true);
https://docs.snowflake.com/en/user-guide/semistructured-considerations.html
True or False: When you create a custom role, it is a best practice to immediately grant that role to ACCOUNTADMIN.
- A . True
- B . False
B
Explanation:
Snowflake recommends granting custom roles to the SYSADMIN role (directly or through a role hierarchy), not to ACCOUNTADMIN. Mapping custom roles under SYSADMIN lets system administrators manage the objects those roles create, while keeping ACCOUNTADMIN reserved for the small set of users who truly need it.
Reference: https://docs.snowflake.com/en/user-guide/security-access-control-considerations.html
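A minimal sketch of that pattern (built-in roles aside, the role name is hypothetical):
use role securityadmin;
create role analyst;
grant role analyst to role sysadmin;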
What Snowflake role must be granted for a user to create and manage accounts?
- A . ACCOUNTADMIN
- B . ORGADMIN
- C . SECURITYADMIN
- D . SYSADMIN
B
Explanation:
The ORGADMIN (organization administrator) role manages operations at the organization level, including creating accounts and viewing all accounts in an organization. ACCOUNTADMIN is the top-level role within a single account; it administers that account but does not create accounts.
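For illustration, account creation runs under the ORGADMIN role; all values below are placeholders:
use role orgadmin;
create account my_new_account
admin_name = admin_user
admin_password = 'ChangeMe123!'
email = 'admin@example.com'
edition = enterprise;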
Which Snowflake objects track DML changes made to tables, like inserts, updates, and deletes?
- A . Pipes
- B . Streams
- C . Tasks
- D . Procedures
B
Explanation:
https://dataterrain.com/how-to-change-tracking-using-table-streams-in-snowflake/#:~:text=A%20stream%20is%20a%20Snowflake,as%20metadata%20about%20each%20change.
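A short sketch (the table and stream names are hypothetical):
create stream orders_stream on table orders;
-- after DML runs against ORDERS, the stream returns the changed rows
-- plus METADATA$ACTION, METADATA$ISUPDATE, and METADATA$ROW_ID columns
select * from orders_stream;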
What is the recommended file sizing for data loading using Snowpipe?
- A . A compressed file size greater than 100 MB, and up to 250 MB
- B . A compressed file size greater than 100 GB, and up to 250 GB
- C . A compressed file size greater than 10 MB, and up to 100 MB
- D . A compressed file size greater than 1 GB, and up to 2 GB
A
Explanation:
https://www.phdata.io/blog/how-to-optimize-snowpipe-data-load/#:~:text=Snowpipe%20is%20typically%20used%20to,data%20within%20one%2Dminute%20intervals.
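For context, a basic pipe definition looks like the following (stage, pipe, and table names are hypothetical); the sizing guidance applies to the files that land in the stage:
create pipe my_pipe auto_ingest = true as
copy into my_table
from @my_stage
file_format = (type = 'JSON');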
What tasks can be completed using the copy command? (Select TWO)
- A . Columns can be aggregated
- B . Columns can be joined with an existing table
- C . Columns can be reordered
- D . Columns can be omitted
- E . Data can be loaded without the need to spin up a virtual warehouse
C, D
Explanation:
The COPY command supports simple transformations during a load: columns can be reordered or omitted, and values can be cast or truncated. Aggregations and joins are not supported, and running COPY requires an active virtual warehouse.
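A sketch of reordering and omitting columns during a load (all names are hypothetical): the SELECT picks only the first and fourth fields of each staged CSV record, in a different order than they appear in the file:
copy into home_sales (zip, city)
from (select t.$4, t.$1 from @my_stage t)
file_format = (type = 'CSV');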
When is the result set cache no longer available? (Select TWO)
- A . When another warehouse is used to execute the query
- B . When another user executes the query
- C . When the underlying data has changed
- D . When the warehouse used to execute the query is suspended
- E . When it has been 24 hours since the last query
C, E
Explanation:
Persisted query results are reused only if the underlying table data has not changed and the result is still available; results are purged after 24 hours unless the same query re-accesses them. The result cache lives in the cloud services layer, so it is independent of which warehouse runs the query and of whether that warehouse is suspended.
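Result reuse can also be turned off per session, which is useful when benchmarking warehouse performance; USE_CACHED_RESULT is a real session parameter:
alter session set use_cached_result = false;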