
I have a utility class for internal use that downloads files from an FTP server. In the past all of these have been flat text files and it has worked without any problems. However, we now have a compressed file I want it to download. When this file is written locally, its size changes from ~80 KB to ~140 KB. Why does the file get corrupted during the write process? Why doesn't this happen with normal text files? What changes are required to make this code work for all file types?

        private static string GetFile(string file, bool inBound)
        {
            string path;
            if (inBound)
                path = Config.Settings["FTPIn"] + file;
            else
                path = Config.Settings["FTPOut"] + file;

            FtpWebRequest request = (FtpWebRequest)WebRequest.Create(path);
            request.Method = WebRequestMethods.Ftp.DownloadFile;

            request.Credentials = new NetworkCredential(Config.Settings["FTPUser"], Config.Settings["FTPPass"]);
            FtpWebResponse response = (FtpWebResponse)request.GetResponse();
            string file_contents = System.String.Empty;

            using (Stream responseStream = response.GetResponseStream())
            {
                using (StreamReader reader = new StreamReader(responseStream))
                {
                    try
                    {
                        file_contents = reader.ReadToEnd();
                    }
                    catch (Exception e)
                    {
                        ResultLogger.LogVerbose(trGuid, "Exception while getting file from FTP share.");
                        ResultLogger.LogVerbose(trGuid, e.Message);
                        ResultLogger.LogVerbose(trGuid, e.StackTrace);
                    }
                    reader.Close();
                    response.Close();
                }
            }

            return file_contents;
        }

Note there are no runtime errors (at least not until some other code attempts to decompress the file); the file is simply being corrupted during the write.

Also, the calling code is below:

      string file_contents = GetFile(fileName, inBound);
      File.WriteAllText(@".\" + fileName, file_contents);
      return new FileInfo(@".\" + fileName);
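
For what it's worth, one quick way to confirm the size mismatch is to ask the server for the file's size and compare it with the local copy. This is just a diagnostic sketch, not part of the original code; `path` stands for the same FTP URI that GetFile builds internally:

    // Rough diagnostic sketch: compare the size reported by the
    // server with the size of the file written locally.
    FtpWebRequest sizeRequest = (FtpWebRequest)WebRequest.Create(path);
    sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize;
    sizeRequest.Credentials = new NetworkCredential(Config.Settings["FTPUser"], Config.Settings["FTPPass"]);

    using (FtpWebResponse sizeResponse = (FtpWebResponse)sizeRequest.GetResponse())
    {
        long remoteSize = sizeResponse.ContentLength;            // size reported by the server
        long localSize = new FileInfo(@".\" + fileName).Length;  // size after the write
        Console.WriteLine("remote: " + remoteSize + " bytes, local: " + localSize + " bytes");
    }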
evanmcdonnal

2 Answers


You don't want to read the raw data into a string; you should read it into a byte array. A .NET string is made up of two-byte (UTF-16) characters, so decoding arbitrary bytes as text mangles any byte sequences that aren't valid in the reader's encoding.
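
A minimal sketch of why that matters: push arbitrary bytes through a text decode/encode round trip and the data changes, because sequences that aren't valid in the reader's encoding are replaced rather than preserved (the sample bytes are just illustrative):

    using System;
    using System.IO;
    using System.Text;

    class RoundTripDemo
    {
        static void Main()
        {
            // 0x8B, 0xFF and 0xFE can never appear in valid UTF-8.
            byte[] original = { 0x1F, 0x8B, 0x08, 0x00, 0xFF, 0xFE };

            // StreamReader decodes as UTF-8 by default; each invalid
            // sequence becomes the U+FFFD replacement character.
            string asText;
            using (StreamReader reader = new StreamReader(new MemoryStream(original)))
                asText = reader.ReadToEnd();

            // Re-encoding emits three bytes per replacement character,
            // so the output is larger and no longer the original data.
            byte[] roundTripped = Encoding.UTF8.GetBytes(asText);
            Console.WriteLine(original.Length + " bytes in, " + roundTripped.Length + " bytes out");
        }
    }

This prints "6 bytes in, 12 bytes out", which is the same kind of expansion that grew the ~80 KB download to ~140 KB.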

You should use a BinaryReader or, better yet, write the stream directly to the file without reading it into memory first, using Stream.CopyTo():

private static void GetFile(string file, bool inBound)
{
    // Removed the FTP setup code

    using (Stream responseStream = response.GetResponseStream())
    {
    using (Stream outputStream = File.Create(@".\" + file)) // Create truncates; OpenWrite could leave stale trailing bytes if the file already exists
        {
            try
            {
                responseStream.CopyTo(outputStream);
            }
            catch (Exception e)
            {
                // Removed Exception code
            }
        }
    }
}
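
Since this version writes the file itself and returns nothing, the calling code from the question would shrink to something like:

    GetFile(fileName, inBound);
    return new FileInfo(@".\" + fileName);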
Steve Czetty
  • I have to step into a meeting but I'll try this out later today when I have time. I have a feeling it will solve my problems. – evanmcdonnal Oct 01 '13 at 17:58

Your problem is that you need to set request.UseBinary = true; when downloading binary files. Otherwise the transfer runs in ASCII mode, and the FTP stream translates byte values that aren't valid text into valid string values to send over the wire. This is what causes the increase in size, as well as the "corrupted" file on download, because the code isn't accounting for that translation. The code should look like this:

UPDATE: I just noticed that you were returning it as a string as well. I modified the code to read into a buffer and return the file's bytes.

private static byte[] GetFile(string file, bool inBound)
{
    string path;
    if (inBound)
        path = Config.Settings["FTPIn"] + file;
    else
        path = Config.Settings["FTPOut"] + file;

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(path);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.UseBinary = true;

    request.Credentials = new NetworkCredential(Config.Settings["FTPUser"], Config.Settings["FTPPass"]);
    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    byte[] file_contents = null;

    using (Stream responseStream = response.GetResponseStream())
    {
        byte[] buffer = new byte[2048];
        using (MemoryStream ms = new MemoryStream())
        {
            int readCount = responseStream.Read(buffer, 0, buffer.Length);
            while (readCount > 0)
            {
                ms.Write(buffer, 0, readCount);
                readCount = responseStream.Read(buffer, 0, buffer.Length);
            }

            file_contents = ms.ToArray();
        }
    }

    return file_contents;
}
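
With GetFile now returning a byte[], the calling code from the question also has to switch from File.WriteAllText to File.WriteAllBytes, e.g.:

    byte[] file_contents = GetFile(fileName, inBound);
    File.WriteAllBytes(@".\" + fileName, file_contents);
    return new FileInfo(@".\" + fileName);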
Adam Gritt
  • That didn't solve the problem. I think it could be a step in the right direction, but I'm thinking I need to change all the file handling code to work with a byte stream/array rather than strings, or something to that effect. This question http://stackoverflow.com/questions/3196226/ftpwebrequest-download-file-incorrect-size gives me the impression that the problem is likely due to having the file contents stored as a string and then writing that string. Also, how would this impact the handling of non-binary files? Does it need to be conditional? – evanmcdonnal Oct 01 '13 at 17:56
  • @evanmcdonnal You would have to know what type of file you are pulling down (Binary or Non-Binary) and then call the appropriate method to do so. At that point it would be best to save right to the file system instead of returning the data from the method or treat both methods as returning a byte array. – Adam Gritt Oct 01 '13 at 17:58
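
A sketch of that last suggestion: always download and save as bytes, and decode to text only when the caller knows a particular file is textual (the ASCII encoding here is an assumption; substitute whatever the text files actually use):

    byte[] data = GetFile(fileName, inBound);
    File.WriteAllBytes(@".\" + fileName, data);

    // Only when this particular file is known to be plain text:
    string text = Encoding.ASCII.GetString(data);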