Snowflake’s Data Cloud is powered by an advanced data platform provided as Software-as-a-Service (SaaS). Snowflake enables data storage, processing, and analytic solutions that are faster, easier to use, and far more flexible than traditional offerings.
The Snowflake data platform is not built on any existing database technology or “big data” software platforms such as Hadoop. Instead, Snowflake combines a completely new SQL query engine with an innovative architecture natively designed for the cloud. To the user, Snowflake provides all of the functionality of an enterprise analytic database, along with many additional special features and unique capabilities.
SnowSQL is the next-generation command-line client for connecting to Snowflake to execute SQL queries and perform all DDL and DML operations, including loading data into and unloading data out of database tables.
Steps for setting up SnowSQL
1. Download the installation files from the Snowflake Client Repository.
2. Once installation is complete, verify it by opening a command prompt window and typing the command "snowsql". The client should print its usage and available options.

3. Next we need to log in. In this article I will focus on SSO login using Azure AD; for a detailed overview of the various authentication methods, please refer to the Snowflake documentation. Type the command below in the Windows command prompt you used to verify the SnowSQL installation in step 2.
snowsql -a youraccountname -u yourusername@yourdomain.com --authenticator externalbrowser
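If you connect often, SnowSQL can also read connection details from its configuration file (~/.snowsql/config on Linux/macOS, or under your user profile on Windows) so you don't retype them. A minimal sketch — the connection name myconn and the placeholder values are illustrative:

```ini
[connections.myconn]
accountname = youraccountname
username = yourusername@yourdomain.com
```

With that in place, you can start the same SSO login with `snowsql -c myconn --authenticator externalbrowser`.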
To get details like the account name, username, and region, use the following commands in Snowflake's default browser-based SQL editor (Worksheets):
select current_region();
select current_account();
show global accounts like 'myaccount%';
4. Once connected, you need to switch to the correct role, warehouse, database, and schema. Below are the commands you can use to do so in SnowSQL.
use role role_name;
use warehouse warehouse_name;
use database database_name;
use schema schema_name;
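To confirm the session context is what you expect, you can query it with Snowflake's standard context functions:

```sql
select current_role(), current_warehouse(), current_database(), current_schema();
```

Each column shows the value currently in effect for the session, or NULL if that part of the context has not been set.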
5. Let's pick the scenario of loading data from flat files using SnowSQL. First we need to create a stage, Snowflake's staging area for files. Use the command below to do so.
create stage your_stage_name;
Make sure you see the message below each time you execute any of the commands shown above
+----------------------------------+
| status                           |
|----------------------------------|
| Statement executed successfully. |
+----------------------------------+
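To double-check that the stage exists, you can list stages matching its name (your_stage_name follows the example above):

```sql
show stages like 'your_stage_name%';
```

The output includes the stage's database, schema, and owner, which is handy when the same name exists in more than one schema.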
6. Now, using the PUT command, we can upload a local file into the stage.
put file://C:\temp\Book1.csv @your_stage_name
+-----------+--------------+-------------+-------------+--------------------+--------------------+----------+---------+
| source    | target       | source_size | target_size | source_compression | target_compression | status   | message |
|-----------+--------------+-------------+-------------+--------------------+--------------------+----------+---------|
| Book1.csv | Book1.csv.gz |          42 |          71 | NONE               | GZIP               | UPLOADED |         |
+-----------+--------------+-------------+-------------+--------------------+--------------------+----------+---------+
1 Row(s) produced. Time Elapsed: 3.872s
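To verify the upload, LIST shows the files currently sitting in the stage; you should see the compressed Book1.csv.gz reported by PUT:

```sql
list @your_stage_name;
```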
7. Now go back to Worksheets in the browser where you are logged in to the Snowflake UI and run the commands below to load the data from the stage into an actual table.
CREATE TABLE book1_demo (col1 STRING, col2 STRING);
CREATE FILE FORMAT CSV_COMMA
TYPE=csv
field_delimiter= ','
skip_header=1;
copy into book1_demo
from @your_stage_name
file_format = (format_name = 'CSV_COMMA');
select * from book1_demo;
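If the COPY loads unexpected values, you can also query the staged file directly before loading it; the positional columns $1 and $2 refer to the first two CSV fields, and the stage and file format are the ones created above:

```sql
select $1, $2
from @your_stage_name (file_format => 'CSV_COMMA');
```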
I found it a little confusing to follow the documentation and figure this out, since loading data is a critical piece of working with the Snowflake data warehouse. Using SnowSQL makes a lot of these tasks fast, efficient, and programmer-friendly.