
I have a page that uploads CSV files and saves them to a table in the database. I'm using the connection string below to read a CSV file.

ls_map_path = Server.MapPath(as_path)

strConn = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
    ls_map_path & ";Extended Properties=""Text;HDR=Yes;FMT=Delimited"";"

Set lo_conn = Server.CreateObject("ADODB.Connection")
lo_conn.Open strConn

Set lo_rs = Server.CreateObject("ADODB.Recordset")
lo_rs.CursorLocation = adUseClient
lo_rs.Open "SELECT * FROM " & as_file_name, lo_conn, adOpenStatic, adLockOptimistic, adCmdText

Do While Not lo_rs.EOF
    ls_notes = lo_rs.Fields.Item("notes").Value
    uf_save_note(ls_notes)
    lo_rs.MoveNext
Loop

The lo_rs.Fields.Item("notes").Value is always truncated to 300 characters. I want to read this field without that limitation.

Please advise. Thanks in advance!

jeff
  • just as an experiment, upload a file where the first row has over 300 chars in that field; I think it may be an issue with the automatic data type detection.. digging for you ..;-) –  Sep 14 '11 at 08:49

1 Answer


I think the issue may be to do with the automatic datatype settings: Jet samples the rows in the CSV and creates a recordset to fit. If the rows it samples are under 300 chars, it won't give you enough room.

I think there is a registry key you can change to make it analyse more rows. This link is talking about Excel, but I think it's related:

http://support.microsoft.com/kb/281517

Could be wrong.. it's not something I have actually ever experienced myself.

from the article:

"The valid range of values for the TypeGuessRows key is 0 to 16. However, if the value is 0, the number of source rows scanned is 16384. Keep in mind that a value of zero may cause a small performance hit if the source file is very large."
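For reference, a sketch of where those keys live in the registry. Note the KB article covers the Excel ISAM's TypeGuessRows; my assumption is that the Text ISAM (which handles CSV) uses its own MaxScanRows value under the Text engine key instead, so verify the exact path and name on your box before relying on this:

    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel]
    "TypeGuessRows"=dword:00000000

    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Text]
    "MaxScanRows"=dword:00000000

A value of 0 tells the engine to scan a much larger number of rows before guessing types, at a small performance cost on big files, as the quoted article says.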

To see if this is the problem at all, the best test would be to open a CSV where the first row contains over 300 characters.

If this is the problem, my preferred solution would be to specify the column types up front rather than letting Jet guess, but I don't have a setup here to test if that's possible, sorry. I will have a look for some references for you though.
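One way to declare column types up front with the Jet text driver is a schema.ini file placed in the same folder as the CSV. A sketch only, with assumed names: "upload.csv" stands in for your actual file name, and "id"/"notes" for your real column names; declaring the notes column as Memo should lift the short-text limit. Note that once you start listing columns you generally need to declare all of them (Col1 through Col30 in your case), not just the one that truncates:

    [upload.csv]
    ColNameHeader=True
    Format=CSVDelimited
    Col1=id Long
    Col30=notes Memo

With a schema.ini in place the driver stops sampling rows to guess types, so the TypeGuessRows/MaxScanRows question becomes moot for this file.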

  • as a hack (yes, I have worked with classic ASP ;-) ) you could fudge in a big row yourself at the start and then skip it when reading in the data.. or maybe you could configure the recordset prior to population (I can't remember if that works in classic ASP) –  Sep 14 '11 at 08:55
  • Hi gardatron, I tried editing the "TypeGuessRows" registry value and changing it to decimal "16", but the problem still occurs. The CSV consists of 30 columns, and "notes", the one containing the large text value, is in the 30th position. Should I move notes into the 1–16 position range, like the hack you're suggesting? thanks a lot! – jeff Sep 15 '11 at 00:05
  • to see if this is the issue at all, just open a CSV that contains over 300 chars in the first row; it does not matter which column. –  Sep 15 '11 at 08:16