
I have a file that is more than 8 GB and I want to load it into Snowflake. I was going through the Snowflake documentation and found the best practices, which say to keep file sizes between 10 MB and 100 MB for the best load performance.

https://docs.snowflake.net/manuals/user-guide/data-load-considerations-prepare.html

Is it possible to split the file in Snowflake itself? That is, could I upload the 8 GB file to Azure Blob storage and then use Snowflake to split it into multiple files and load them into a table?

Shivenndoo

2 Answers


No, it's not possible to split a file with Snowflake before loading it.
Snowflake can only split data into multiple files when unloading a table to cloud storage.
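
For reference, that unload-side splitting is controlled with the MAX_FILE_SIZE copy option on COPY INTO <location>. Below is a minimal sketch using the Python connector; the connection parameters, stage, and table names are placeholders, not anything from your setup:

    # Sketch of the unload direction: COPY INTO <location> writes multiple
    # files by default, and MAX_FILE_SIZE caps the size of each file (bytes).
    # Connection parameters, stage and table names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",    # placeholder
        user="my_user",          # placeholder
        password="my_password",  # placeholder
        warehouse="my_wh",
        database="my_db",
        schema="public",
    )
    try:
        conn.cursor().execute(
            """
            COPY INTO @my_stage/export/
            FROM my_table
            FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
            MAX_FILE_SIZE = 104857600   -- roughly 100 MB per output file
            """
        )
    finally:
        conn.close()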

But I guess there are possibilities within Azure, for example an Azure Batch job: How to split large file into smaller files
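
If you end up splitting the file yourself before uploading it to Azure Blob, a simple client-side script is often enough. Here is a minimal sketch, assuming a CSV with a header row; the source file name and the ~100 MB chunk size are illustrative assumptions:

    # Minimal sketch: split a large CSV into ~100 MB chunks, repeating the
    # header in every chunk so each part can be loaded independently.
    import os

    SRC = "bigfile.csv"               # hypothetical 8 GB source file
    CHUNK_BYTES = 100 * 1024 * 1024   # target ~100 MB per output file

    def split_csv(src: str, chunk_bytes: int) -> None:
        base = os.path.splitext(src)[0]
        with open(src, "r", encoding="utf-8") as f:
            header = f.readline()     # carry the header into every chunk
            part, out, written = 0, None, 0
            for line in f:
                if out is None or written >= chunk_bytes:
                    if out:
                        out.close()
                    part += 1
                    out = open(f"{base}_part{part:04d}.csv", "w", encoding="utf-8")
                    out.write(header)
                    written = len(header.encode("utf-8"))
                out.write(line)
                written += len(line.encode("utf-8"))
            if out:
                out.close()

    if __name__ == "__main__":
        split_csv(SRC, CHUNK_BYTES)

The resulting parts can then be uploaded to Azure Blob and loaded in parallel with a single COPY INTO <table> from the stage.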

Hans Henrik Eriksen

I would add that although it's not currently possible in Snowflake today, you are more than welcome to submit a feature request here: https://community.snowflake.com/s/ideas

Suzy Lockwood