
I have a few SQL Server tables, one with 200,000 rows and the other with 900,000 rows, both with around 50 columns. I need to export them to a CSV file from a VBA script (running inside Excel). All rows and all columns must be exported.

Assume you have a recordset oRec returned from a simple query. All columns are varchars, mostly varchar(20) or varchar(200):

'SELECT * FROM TABLE'.

In the code below I loop through the recordset and then through each column. If I comment out the column loop, or only read the first column, it runs quickly; each additional column makes it dramatically slower, until the full export takes over an hour.

Is there a better way? I need every row and column, and VBA/Excel is the only option because this is part of a data-processing system built inside Excel. It's for end users, so they have no SQL tools or libraries (just Office), and I cannot distribute applications. This export is just one small part of a larger Excel automation project.

While Not oRec.EOF
    sLine = ""
    LineCount = LineCount + 1

    ' Build one CSV line by concatenating every field in the current record
    For Each col In oRec.Fields
        If IsNull(col.value) Then
            sVal = "NULL"
        Else
            sVal = col.value
        End If

        If Not bHaveHeader Then
            sHeader = sHeader & CStr(col.name) & ","
            sLine = sLine & "," & sVal
        Else
            sLine = sLine & "," & sVal
        End If
        colOffset = colOffset + 1
    Next
    colOffset = 0

    ' Append the line (minus its leading comma) to the output buffer
    If bHaveHeader Then
        sLines = sLines & Right(sLine, Len(sLine) - 1) & vbCrLf
    Else
        If LineCount Mod EXPORT_LINE_BUFFER = 0 Then
            'lines ready to be exported. last line can't have carriage return
            sLines = sHeader & vbCrLf & sLines & Right(sLine, Len(sLine) - 1)
            Call UpdateFile(fs, sExportFile, sLines)
            sLines = ""
        Else
            sLines = sHeader & vbCrLf & sLines & Right(sLine, Len(sLine) - 1) & vbCrLf
        End If
    End If

    bHaveHeader = True

    oRec.MoveNext
Wend
Neil
    Why do you need to do this with VBA? If you do need to use a CSV, I would suggest there are better options – Thom A Feb 28 '20 at 17:21
  • Please define columns as they may carry high overhead such as `blob` , high precision numbers, and max `varchar`. Please also show example of *when I put in a fairly small set (e.g. 3000 rows)*. – Parfait Feb 28 '20 at 17:37
  • 1
    What happens in VBA after you open the recordset? Are you saying it takes 30min to get past that `Execute` line? – Tim Williams Feb 28 '20 at 17:37
  • @Neil, is that 64-bit or 32-bit Excel? And please add the code that copies the recordset data into the Excel sheet. – HTTP 410 Feb 28 '20 at 17:37
  • Right, sorry, my fault, I made a right botch of the question. didn't profile it correctly. It's not the query at fault, it's the column looping. I'll update the question. – Neil Feb 28 '20 at 17:45
  • 1
    If you're building up a CSV string to write to a file, then skip the concatenation and write directly to the file as you loop. – Tim Williams Feb 28 '20 at 17:55
  • 2
    But still, why VBA? Wouldn't `bcp` or a proper ETL tool just be better? – Thom A Feb 28 '20 at 17:57
  • 1
    @larnu this is enterprise land and it's given to normal users. Not allowed to create or distribute applications and they don't have tools like bcp. – Neil Feb 28 '20 at 17:59
  • But *normal* users are allowed to extract the **entire** table? I admit the users at the office don't have access to `bcp`, but also they don't have access to extract the entire tables at a whim. If they did, we'd supply some form of application layer to do that; where they wouldn't be able to edit the code (because they would be able to with a VBA Macro). – Thom A Feb 28 '20 at 18:01
  • Maybe try looking at `https://www.w3schools.com/asp/met_rs_getstring.asp` Not sure how well it performs with huge recordsets though. – Tim Williams Feb 28 '20 at 18:14
  • @larnu this data they receive is a forecast that they run on data they already own – Neil Feb 28 '20 at 18:27
  • @TimWilliams Thanks, I'll give it a try. My current trial is extracting all the columns and concatenating them into a single string, i.e. returning one column per row. I suspect it'll be costly as I have to cast and convert nulls... – Neil Feb 28 '20 at 18:29
  • `GetString` will handle nulls for you - just pass it what you want them to be represented as. Any performance changes though will likely be mostly dependent on what your current value for EXPORT_LINE_BUFFER is, and how wide your table is (ie. how much concatenation are you actually doing). You could also look into using a StringBuilder class instead of using concatenation: that will also be faster. Example here - https://stackoverflow.com/questions/1070863/hidden-features-of-vba – Tim Williams Feb 28 '20 at 18:31
  • It works, about 10 minutes and I worry of memory. However, if I use the 'get data' option inside Excel to connect to my database and load data, it takes <1 minute to retrieve all the data into a sheet, so there must be a better way than my code. I guess I could automate the get data option if all else fails... – Neil Feb 28 '20 at 19:48
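
Following Tim Williams' suggestion above to skip the concatenation and write each row straight to the file, here is a minimal sketch of what that could look like. It uses plain Open/Print # rather than the question's UpdateFile/fs helpers, and the procedure name and sExportFile parameter are placeholders, not from the original code.

Sub ExportDirectToFile(oRec As Object, sExportFile As String)
    ' Write each CSV line to the file as soon as it is built,
    ' so no large string is ever accumulated in memory.
    Dim iFile As Integer, col As Object
    Dim sLine As String, sHeader As String

    iFile = FreeFile
    Open sExportFile For Output As #iFile

    ' Header row from the field names
    For Each col In oRec.Fields
        sHeader = sHeader & col.Name & ","
    Next col
    Print #iFile, Left$(sHeader, Len(sHeader) - 1)

    ' Data rows
    Do While Not oRec.EOF
        sLine = ""
        For Each col In oRec.Fields
            If IsNull(col.Value) Then
                sLine = sLine & ",NULL"
            Else
                sLine = sLine & "," & col.Value
            End If
        Next col
        Print #iFile, Mid$(sLine, 2)   ' drop the leading comma
        oRec.MoveNext
    Loop

    Close #iFile
End Sub

Because each line is written immediately, the cost grows roughly linearly with the number of rows instead of with the size of the ever-growing sLines buffer.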
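
The GetString approach Tim Williams links to can be sketched roughly as below. GetString flattens the whole recordset in one call and substitutes "NULL" for null values itself; 2 is the value of adClipString for late binding. Again the procedure name and file handling are illustrative only, and for a 900,000-row table the single result string can be very large, so it may be worth passing NumRows to pull the data in chunks.

Sub ExportWithGetString(oRec As Object, sExportFile As String)
    Dim sAll As String, sHeader As String
    Dim col As Object, iFile As Integer

    ' Header row from the field names
    For Each col In oRec.Fields
        sHeader = sHeader & col.Name & ","
    Next col

    ' One call replaces the per-row, per-column loop entirely
    sAll = oRec.GetString(2, , ",", vbCrLf, "NULL")   ' 2 = adClipString

    iFile = FreeFile
    Open sExportFile For Output As #iFile
    Print #iFile, Left$(sHeader, Len(sHeader) - 1)
    Print #iFile, sAll;   ' trailing ; avoids adding an extra blank line
    Close #iFile
End Sub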
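
Finally, the "<1 minute with Get Data" observation suggests letting Excel do the writing: dump the recordset onto a temporary worksheet with CopyFromRecordset and save that sheet as CSV. This is only a sketch, assuming the row count stays below Excel's 1,048,576-row sheet limit (true for the 200,000- and 900,000-row tables mentioned); names are placeholders.

Sub ExportViaWorksheet(oRec As Object, sExportFile As String)
    Dim wb As Workbook, ws As Worksheet
    Dim col As Object, i As Long

    Set wb = Application.Workbooks.Add
    Set ws = wb.Worksheets(1)

    ' Header row, then bulk-copy all data rows in one call
    i = 0
    For Each col In oRec.Fields
        i = i + 1
        ws.Cells(1, i).Value = col.Name
    Next col
    ws.Range("A2").CopyFromRecordset oRec

    ' Save the temporary workbook as CSV and discard it
    Application.DisplayAlerts = False
    wb.SaveAs Filename:=sExportFile, FileFormat:=xlCSV
    wb.Close SaveChanges:=False
    Application.DisplayAlerts = True
End Sub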

0 Answers