
I have a MySQL database "DB" containing a table "TABLE" with a single field and about 8 million rows. I am trying to download that table into a text file as follows:

//Open connection.
connection.Open();

//Create command.
MySqlCommand command = connection.CreateCommand();
command.CommandText = "SELECT * FROM `DB`.`TABLE`";

//Execute command.
MySqlDataReader result = command.ExecuteReader();

//If result isn't null
if (result != null)
{
    //Open stream to write result; the using block makes sure the file is flushed and closed.
    using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\Users\StackOverflow\Desktop\ID_List.txt", true))
    {
        //For each row of result.
        while (result.Read())
        {
            //Write result in a line.
            file.WriteLine(result.GetValue(0).ToString());
        }
    }
}

//Close connection.
connection.Close();

Upon running, it starts downloading the data and outputting it to the text file, but after only a minute or two it gives:

An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe

Additional information: Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.

What might I change or do differently for it to work? Thanks for any suggestions and answers :)

[EDIT]: I added command.CommandTimeout = 240; and tested three times; every time it downloaded for a couple of minutes (about 40MB of data each time) before giving:

An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe

Additional information: Fatal error encountered during data read.

And these are the last few lines of the debug output:

at MySql.Data.MySqlClient.NativeDriver.FetchDataRow(Int32 statementId, Int32 columns)
   at MySql.Data.MySqlClient.Driver.FetchDataRow(Int32 statementId, Int32 columns)
   at MySql.Data.MySqlClient.ResultSet.GetNextRow()
   at MySql.Data.MySqlClient.ResultSet.NextRow(CommandBehavior behavior)
   at MySql.Data.MySqlClient.MySqlDataReader.Read()

Inner exception: System.IO.EndOfStreamException: Attempted to read past the end of the stream.
   at MySql.Data.MySqlClient.MySqlStream.ReadFully(Stream stream, Byte[] buffer, Int32 offset, Int32 count)
   at MySql.Data.MySqlClient.MySqlStream.LoadPacket()

An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Fatal error encountered during data read.
The program '[4548] test program.exe: Program Trace' has exited with code 0 (0x0).
The program '[4548] test program.exe: Managed (v4.0.30319)' has exited with code 0 (0x0).

[EDIT2]: I added MySqlDataReader result = command.ExecuteReader(System.Data.CommandBehavior.SingleResult); in an effort to remedy the above error and now I am getting:

An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Query execution was interrupted

After some light googling I figure it is GoDaddy that is limiting the query execution time. A working solution is therefore to wrap the read in a try/catch with a counter that tracks how many records were retrieved before the connection was closed, then re-open a new connection and use the counter as the starting point for LIMIT, repeating until the table is exhausted.
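The approach described above can be sketched roughly as follows. This is an untested sketch: `connectionString` and the file path are placeholders, and note that without an ORDER BY a LIMIT offset is only reliable when the rows come back in a stable order, so ordering by the primary-key column is assumed here:

```csharp
long offset = 0;
bool finished = false;

//Append mode, as in the original code, so resumed runs keep earlier rows.
using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\Users\StackOverflow\Desktop\ID_List.txt", true))
{
    while (!finished)
    {
        try
        {
            using (MySqlConnection connection = new MySqlConnection(connectionString))
            {
                connection.Open();
                MySqlCommand command = connection.CreateCommand();
                //Resume after the rows already written; the huge second LIMIT
                //argument is MySQL's documented idiom for "all remaining rows".
                command.CommandText = "SELECT * FROM `DB`.`TABLE` ORDER BY 1 LIMIT "
                                      + offset + ", 18446744073709551615";
                using (MySqlDataReader result = command.ExecuteReader())
                {
                    while (result.Read())
                    {
                        file.WriteLine(result.GetValue(0).ToString());
                        offset++; //Counter survives a dropped connection.
                    }
                }
                finished = true; //Reader ran to the end without an exception.
            }
        }
        catch (MySqlException)
        {
            //Connection was cut by the host; loop re-opens and resumes at offset.
        }
    }
}
```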

Thanks for all the help!

Pi_

3 Answers


Increase the wait_timeout parameter in my.ini file.

http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_wait_timeout

There are other timeout parameters that could be the problem, but I think this is the one.

If you don't have access to your ini file, use LIMIT to get a limited result set.

First, get the total row count with COUNT(*). Second, fetch small result sets in a loop:

 command.CommandText = "SELECT COUNT(*) FROM `DB`.`TABLE`";
 long numberOfRowsInYourTable = (long)command.ExecuteScalar();

 for (long i = 0; i < numberOfRowsInYourTable; i += 1000)
 {
      // LIMIT offset, count
      command.CommandText = "SELECT * FROM `DB`.`TABLE` LIMIT " + i + ", 1000";
      // Do what you want
 }
kmas
  • I don't have access to the server settings; that database is on a GoDaddy server. – Pi_ Jul 08 '13 at 12:04
  • It is usually quicker to have small requests (10 000 rows for example) than to get everything in one request. – kmas Jul 09 '13 at 07:13
  • I have it get as many as it can and keep a counter of how many that is; when the exception occurs I re-open a connection and continue from there. I tried doing what you suggested using 1k, 10k and 100k intervals, but it was actually slower. Nonetheless, thanks for your help. – Pi_ Jul 09 '13 at 11:29
  • Have you got good indexes? That may be the problem with LIMIT. – kmas Jul 09 '13 at 11:48
  • The table is indexed on its only field which is also the PK. – Pi_ Jul 09 '13 at 12:13
  • What can speed things up is writing the result only once, at the end of your export; then you call the write function only once, which can otherwise be time-consuming (if it works the same way as in other languages I know better, such as Java or PHP). You fill a buffer and write it to your file only at the end. – kmas Jul 09 '13 at 12:24
  • Good idea, I'm going to try thanks. I'll let you know once done. – Pi_ Jul 09 '13 at 12:29
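The buffering suggested in the comments could look like the sketch below, reusing the names from the question's code. Caveat: 8 million rows may not fit comfortably in memory, and StreamWriter already buffers internally, so enlarging its buffer is a middle ground:

```csharp
//Accumulate all rows in memory, then write the file in one call.
System.Text.StringBuilder buffer = new System.Text.StringBuilder();
while (result.Read())
{
    buffer.AppendLine(result.GetValue(0).ToString());
}
System.IO.File.WriteAllText(@"C:\Users\StackOverflow\Desktop\ID_List.txt", buffer.ToString());
```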

Add this to your connection string:

default command timeout=240

and adjust it as necessary.
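A full connection string with this option might look like the following (server, credentials, and database name are placeholders):

```
server=myserver.example.com;database=DB;uid=USER;pwd=PASSWORD;default command timeout=240;
```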

Mike Perrenoud
  • Hello and thank you, this fixed the timeout exception but a new exception arises, see original post. – Pi_ Jul 08 '13 at 12:24
  • @Pi_, there's got to be more information than that. What's in the inner exception? That error is useless. – Mike Perrenoud Jul 08 '13 at 12:27
  • I added the last couple of lines from the debug output but aside from that I don't see any extra information given. – Pi_ Jul 08 '13 at 12:37
  • @Pi_, there actually is some **really** useful information in the rest of the exception. The `InnerException` is `Attempted to read past the end of the stream.`. Have a look at this SO post on how one OP resolved that issue. http://stackoverflow.com/questions/9761499/c-sharp-with-mysql-through-connector-net – Mike Perrenoud Jul 08 '13 at 12:46
  • And here is another possible SO post. http://stackoverflow.com/questions/6281304/attempted-to-read-past-end-of-the-stream-error-in-mysql?rq=1 – Mike Perrenoud Jul 08 '13 at 12:47
  • Both of these SO posts have similar but not identical exceptions; the first one is solved by closing connections (which I already do), and the second one talks of timeouts being exceeded, which is not the case here (I tested with a larger timeout). – Pi_ Jul 08 '13 at 13:03

You have several options

  1. you can set a longer timeout; that means setting the `default command timeout` parameter in your connection string (or CommandTimeout on the command)

  2. you can use SELECT ... INTO OUTFILE, which would generate the file on the server

  3. you can use the mysqldump utility; its output is written wherever mysqldump is run, so it can dump a remote table to your machine
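Options 2 and 3 could look like this. Host names, credentials, and paths are placeholders; SELECT ... INTO OUTFILE writes on the database server's filesystem and needs the FILE privilege, so it is typically unavailable on shared hosting such as GoDaddy:

```shell
# Option 2: write the file on the database server
mysql -h myserver.example.com -u USER -p \
      -e "SELECT * FROM \`DB\`.\`TABLE\` INTO OUTFILE '/tmp/ID_List.txt'"

# Option 3: mysqldump writes its output where it is run,
# so this pulls the dump down to the local machine
mysqldump -h myserver.example.com -u USER -p DB TABLE > TABLE_dump.sql
```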

EDIT: Fellow SO users mention other types of timeout parameters; I'm not sure which one is the key here.

bpgergo