SuperBulkCopy (and the command-line sbcp) is one of our newest free tools. It replaces the traditional usage of both 'BULK INSERT' and 'bcp' with something much more powerful, yet simple.
If you've ever worked with either, trying to turn text-based files into a SQL table, you are probably well aware of their various limitations. SuperBulkCopy was written to overcome those limitations and provide a quick, working solution.
OK, so that was the quick refresher. A new version has just come out, and in addition to the existing abilities, it introduces the following:
- Gzip support! You can now bulk import directly from a .gz file
This is a big one -> input files are often compressed, and traditional import tools usually require a preliminary step of uncompressing everything to a different folder.
SuperBulkCopy will detect those compressed files and will extract the stream in-memory, quickly importing the uncompressed data.
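To illustrate the idea, here is a rough Python sketch of detecting a gzip file by its magic bytes and decompressing it as an in-memory stream, with nothing written to disk. This is not SuperBulkCopy's actual code; the function name is illustrative.

```python
import gzip
import io

def open_input(path):
    """Open an import source, transparently decompressing .gz files.

    Illustrative sketch only -- not SuperBulkCopy's implementation.
    """
    raw = open(path, "rb")
    # gzip files start with the magic bytes 0x1f 0x8b
    magic = raw.read(2)
    raw.seek(0)
    if magic == b"\x1f\x8b":
        # Decompress as a stream, entirely in memory
        return io.TextIOWrapper(gzip.GzipFile(fileobj=raw), encoding="utf-8")
    return io.TextIOWrapper(raw, encoding="utf-8")
```

The caller then reads rows from the returned stream the same way regardless of whether the source was compressed.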
- Wildcard support! Now supporting wildcard* input files
Have multiple files in a folder which all need to be imported? No problem -> the wildcard feature will handle them all in a single run, whether to the same table or to different tables.
If you wish to use different table names, see the item below!
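Conceptually, wildcard expansion works like shell globbing: the pattern is expanded to a list of matching files, and each match becomes one import. A minimal sketch (again, illustrative, not the tool's code):

```python
import glob

def files_to_import(pattern):
    """Expand a wildcard pattern such as 'data/*.csv' into the list
    of files to import, one import run per match. Illustrative only."""
    matches = sorted(glob.glob(pattern))
    if not matches:
        raise FileNotFoundError(f"no input files match {pattern!r}")
    return matches
```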
- Output table pattern support!
This feature lets you customize the output table name based on input file names, dates, unique IDs, and more.
The supported patterns are:
%ifn -> Input Filename
%timestamp -> Timestamp (yyyy-mm-dd-hh-mm-ss)
%date -> yyyy-mm-dd
For example, if the Output Table Name is "my_table_%ifn" and the input file name is "my_file.csv", the table will be named "my_table_my_file_csv" (dots in the file name become underscores, so the result is a valid table name).
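The substitution behind these patterns can be sketched in a few lines of Python. This is a hypothetical reimplementation for illustration, following the pattern list and the example above:

```python
from datetime import datetime
from pathlib import Path

def resolve_table_name(pattern, input_file, now=None):
    """Expand the table-name patterns described above (illustrative sketch):

    %ifn       -> input file name, dots replaced by underscores
                  (matching the "my_table_my_file_csv" example)
    %timestamp -> yyyy-mm-dd-hh-mm-ss
    %date      -> yyyy-mm-dd
    """
    now = now or datetime.now()
    ifn = Path(input_file).name.replace(".", "_")
    name = pattern.replace("%timestamp", now.strftime("%Y-%m-%d-%H-%M-%S"))
    name = name.replace("%date", now.strftime("%Y-%m-%d"))
    return name.replace("%ifn", ifn)
```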
- Custom field generation -> add the input filename as an additional field in the output table/file
This new feature allows you to add custom fields to the destination table(s).
The current supported fields are:
- Input Filename
- Input Directory
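The effect is as if each imported row had the source's filename and directory appended as extra columns. A small illustrative sketch of that idea (not the tool's code) over a CSV input:

```python
import csv
from pathlib import Path

def rows_with_source(path):
    """Yield each CSV row with the input filename and input directory
    appended as extra fields. Illustrative sketch only."""
    p = Path(path)
    with open(p, newline="") as f:
        for row in csv.reader(f):
            yield row + [p.name, str(p.parent)]
```

This is handy when many files land in one table and you later need to know which file each row came from.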
More cool things to come -- in the meantime, enjoy this new version!