
I have a table that I have already designed in SQL Server. My goal is to create a stored procedure that appends data from a file location into this table. It is a large data set (over 255 columns), so I cannot load it directly into MS Access (the front end) via VBA. I believe this leaves me with only an SQL option.

I have scoured the internet for a solution, to no avail.

Basic code I tried is below; however, I am stuck and have had no luck finding a fix.

INSERT INTO tablename (field 1, field 2)
SELECT * FROM 'Z:\temp\Cash_Activity_201910091702.csv';
  • I have already built the table in SQL and imported the flat file once. Unfortunately, it is a standard file generated by the underlying provider. I have now turned to running this directly in SQL using a bulk insert statement: BULK INSERT dbname.dbo.table FROM 'location' WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK). Still having no luck, unfortunately. Even when I run the query it says 0 rows affected, so it must be reading the data but not performing the action. Please help – MCHammer Oct 15 '19 at 14:21

2 Answers


Look at this code using ADO to import CSV data; maybe it can help you. In my opinion it is the closest to what you are asking for, but I haven't tried it with over 255 columns.
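
A minimal sketch of that ADO approach, assuming the ACE text driver is available (the folder, file name, and field names are taken from the question and may need adjusting):

Dim cn As Object, rs As Object
Set cn = CreateObject("ADODB.Connection")
' the ACE text driver treats the folder as a database and each CSV in it as a table
cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=Z:\temp\;" & _
        "Extended Properties='text;HDR=Yes;FMT=Delimited'"
' select only the columns you need, so the 255-field table limit never comes into play
Set rs = cn.Execute("SELECT [field 1], [field 2] FROM [Cash_Activity_201910091702.csv]")
Do While Not rs.EOF
    ' append rs.Fields(0) and rs.Fields(1) to the Access table here (e.g. via a DAO recordset)
    rs.MoveNext
Loop
rs.Close: cn.Close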

  • You can't have more than 255 fields (columns) in a table. It sounds like your database is not quite normalized, so you'll have to introduce more tables and either split up your CSV (the best option) or use VBA to import specific CSV columns into specific table columns. – Rene Oct 14 '19 at 22:17
  • Hi Rene, I think you misread Aaron Dietz's question. The 255+ columns are in the CSV file, and he wants to import some of the columns' data into a table in a normal MS Access db, so the solution I pointed him to is valid. – xShen Oct 15 '19 at 19:16
  • Thanks guys; however, I am trying to do this from a stored procedure in SQL. I have tried different variations of this: BULK INSERT Helium.dbo.[BulkUploadTestFields] --FROM 'C:\\SqlFiles\\f1915audt.ext.191008.24' --FROM '\\twe-sql02\\SqlFiles\\f1915audt.ext.191008 - Copy.24' FROM 'C:\SqlFiles\f1915audt.ext.191008 - Copy.24' WITH (FIRSTROW = 4, FIELDTERMINATOR = ',', ROWTERMINATOR = '\r', TABLOCK --, format = 'csv' --fieldquote = ) – MCHammer Oct 16 '19 at 10:36

So I figured it out and would like to post it here for the world.

The issue with having so many columns, and fields over 15 characters long, is that Excel keeps only 15 significant digits in a number: for example, 1234567890123456 becomes 1234567890123450. Long identifiers therefore have to be imported as text.

  1. Using VBA code, I imported the target file as a text file, forcing the first column to be text. (I would recommend recording the macro and clicking which columns you want as text.)

  2. Using VBA code, I then normalised the data as best I could, axing unnecessary columns (a rough sketch is given as code block 2 below).

  3. Once this was done, I created a BULK INSERT statement into my SQL staging table and called it using VBA code. File names are not handled well in SQL, so I had to build the whole statement as a string just for the upload.

1.

With wb.ActiveSheet.QueryTables.Add(Connection:= _
        "TEXT;" & mypath & LedgerString, Destination:=Range("$A$1"))
    .TextFileParseType = xlDelimited: .TextFileCommaDelimiter = True
    .TextFileColumnDataTypes = Array(xlTextFormat) ' force the first column to text
    .Refresh
End With

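2.

The original post gave no code for the normalisation step, since it depends entirely on the file layout; as a rough sketch (the column letters below are placeholders, not from the post):

' drop the columns that are not needed before the upload,
' working right to left so the remaining column letters do not shift
With wb.ActiveSheet
    .Columns("Q:Z").Delete   ' a block of unnecessary columns
    .Columns("C").Delete     ' a single unnecessary column
End With
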
3.

-- @filepath is supplied by the VBA caller; the table names below are the
-- placeholders from the original post, not real object names
CREATE PROCEDURE [procname]
    @filepath NVARCHAR(500)
AS
DECLARE @bulkinsert NVARCHAR(2000)
SET @bulkinsert =

-- load the raw file into the staging table
'BULK INSERT DBName.dbo.[stagingtable] FROM '''
+ @filepath +
''' WITH ' +
'( ' +
    'FIRSTROW = 2, ' +
    'FIELDTERMINATOR = '','', ' +
    'ROWTERMINATOR = ''\n'', ' +
    'TABLOCK ' +
') ' +
-- append only the rows whose UniqueAuditID is not already in the target table
'INSERT INTO [table] (FIELD NAMES) ' +
'SELECT DISTINCT FIELD NAMES ' +
'FROM [stagingtable] AS s ' +
'WHERE NOT EXISTS ( ' +
    'SELECT * ' +
    'FROM [table] AS t ' +
    'WHERE t.UniqueAuditID = s.UniqueAuditID ' +
') ' +
'DELETE FROM [table] ' +

'INSERT INTO NewTable (FIELD NAMES) ' +
'SELECT FIELD NAMES ' +
'FROM [stagingtable] ' +
'WHERE fileRef = '''
+ @filepath +
''''

-- without this the string is only built, never run
EXEC sp_executesql @bulkinsert
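
For completeness, the VBA call for step 3 could look something like the sketch below; the server, database, and procedure names are placeholders, and mypath & LedgerString is the same file reference built in step 1:

Dim cmd As Object
Set cmd = CreateObject("ADODB.Command")
With cmd
    .ActiveConnection = "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
                        "Initial Catalog=DBName;Integrated Security=SSPI;"
    .CommandType = 4   ' adCmdStoredProc
    .CommandText = "dbo.[procname]"
    ' 202 = adVarWChar, 1 = adParamInput; the full file path goes in as @filepath
    .Parameters.Append .CreateParameter("@filepath", 202, 1, 500, mypath & LedgerString)
    .Execute
End With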