Working with Snowflake DW using SnowSQL

Snowflake’s Data Cloud is powered by an advanced data platform provided as Software-as-a-Service (SaaS). Snowflake enables data storage, processing, and analytic solutions that are faster, easier to use, and far more flexible than traditional offerings.

The Snowflake data platform is not built on any existing database technology or “big data” software platforms such as Hadoop. Instead, Snowflake combines a completely new SQL query engine with an innovative architecture natively designed for the cloud. To the user, Snowflake provides all of the functionality of an enterprise analytic database, along with many additional special features and unique capabilities.

SnowSQL is the next-generation command-line client for connecting to Snowflake to execute SQL queries and perform all DDL and DML operations, including loading data into and unloading data out of database tables.

Steps for setting up SnowSQL

  1. Download the installation files from the Snowflake Client Repository.
  2. Once installation is complete, verify it by opening a command prompt window and typing the command "snowsql". You should see the list of available SnowSQL options.
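As an additional sanity check, you can ask for the installed version; on recent SnowSQL releases the -v option prints it (the exact version string will vary by release):

snowsql -v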

3. Next we need to log in. In this article I will focus on logging in with SSO through Azure AD. For a detailed overview of the various authentication methods, refer to the Snowflake documentation here. Type the command below in the Windows command prompt you used to verify the SnowSQL installation in step 2.

snowsql -a youraccountname -u yourusername@yourdomain.com --authenticator externalbrowser
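Once authentication completes in the browser, you are dropped into the SnowSQL prompt (you can leave a session at any time with !exit). If you connect often, you can avoid retyping these options by saving them in the SnowSQL configuration file, typically %USERPROFILE%\.snowsql\config on Windows. Below is a minimal sketch of a [connections] section using the same placeholder account and username as above:

[connections]
accountname = youraccountname
username = yourusername@yourdomain.com
authenticator = externalbrowser

With these defaults saved, running just "snowsql" picks them up automatically.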

To get details such as the account name, username, and region, run the following commands in Snowflake's default browser-based SQL editor (worksheets):

select current_region();
select current_account();
show global accounts like 'myaccount%';
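For accounts outside the default region, the -a value typically combines the account locator with the region and cloud platform. For example, if current_account() returns XY12345 and the account is hosted in Azure East US 2, the connection command would look roughly like this (hypothetical values):

snowsql -a xy12345.east-us-2.azure -u yourusername@yourdomain.com --authenticator externalbrowser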

4. Once connected, you need to switch to the correct role, warehouse, database, and schema. Below are the commands to do so in SnowSQL.

use role role_name;
use warehouse warehouse_name;
use database database_name;
use schema schema_name;
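Alternatively, the same context can be supplied at connection time using SnowSQL's -r (role), -w (warehouse), -d (database), and -s (schema) options, for example:

snowsql -a youraccountname -u yourusername@yourdomain.com --authenticator externalbrowser -r role_name -w warehouse_name -d database_name -s schema_name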

5. Let's pick the scenario of loading data from flat files using SnowSQL. First we need to create a stage to hold the files. Use the command below to do so.

create stage your_stage_name;

Make sure you see the message below each time you execute any of the commands shown above.

+----------------------------------+
| status                           |
|----------------------------------|
| Statement executed successfully. |
+----------------------------------+

6. Now, using the PUT command, we can upload a local file into the stage.

put file://C:\temp\Book1.csv @your_stage_name;

+-----------+--------------+-------------+-------------+--------------------+--------------------+----------+---------+
| source    | target       | source_size | target_size | source_compression | target_compression | status   | message |
|-----------+--------------+-------------+-------------+--------------------+--------------------+----------+---------|
| Book1.csv | Book1.csv.gz |          42 |          71 | NONE               | GZIP               | UPLOADED |         |
+-----------+--------------+-------------+-------------+--------------------+--------------------+----------+---------+
1 Row(s) produced. Time Elapsed: 3.872s
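You can confirm that the file landed in the stage by listing its contents from the same SnowSQL session (the size and md5 values will vary):

list @your_stage_name;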

7. Now go to the worksheets in the browser where you are logged into the Snowflake UI and run the commands below to load the data from the stage into an actual table.

CREATE TABLE book1_demo (col1 STRING, col2 STRING);


CREATE FILE FORMAT CSV_COMMA
TYPE=csv
field_delimiter= ','
skip_header=1;


copy into book1_demo
from @your_stage_name
file_format = (format_name = 'CSV_COMMA');


select * from book1_demo;
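Once the load is verified with the SELECT above, you can optionally clean up the staged file so a future COPY does not pick it up again; REMOVE deletes files from the stage:

remove @your_stage_name;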

I found it a little confusing to follow the documentation and figure this out, as loading data is a critical piece of working with Snowflake DW. Using SnowSQL makes a lot of these tasks fast, efficient, and programmer friendly.

Published by Narayan Sujay Somasekhar

