I have a utility class for internal use that downloads files from an FTP server. In the past these have all been flat text files and it has worked without any problems. However, we now have a compressed file that I want it to download. When this file is written locally, its size changes from ~80 KB to ~140 KB. Why does the file get corrupted during the write process? Why doesn't this happen to normal text files? What changes are required to make this code work for all file types?
private static string GetFile(string file, bool inBound)
{
    string path;
    if (inBound)
        path = Config.Settings["FTPIn"] + file;
    else
        path = Config.Settings["FTPOut"] + file;

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(path);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.Credentials = new NetworkCredential(Config.Settings["FTPUser"], Config.Settings["FTPPass"]);

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();

    string file_contents = System.String.Empty;
    using (Stream responseStream = response.GetResponseStream())
    {
        using (StreamReader reader = new StreamReader(responseStream))
        {
            try
            {
                file_contents = reader.ReadToEnd();
            }
            catch (Exception e)
            {
                ResultLogger.LogVerbose(trGuid, "Exception while getting file from FTP share.");
                ResultLogger.LogVerbose(trGuid, e.Message);
                ResultLogger.LogVerbose(trGuid, e.StackTrace);
            }
            reader.Close();
            response.Close();
        }
    }
    return file_contents;
}
Note that there are no runtime errors (at least not until some other code attempts to decompress the file); the file is simply being corrupted during the write.
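From some experimenting, I suspect the size growth comes from text decoding: bytes that aren't valid UTF-8 get silently replaced when the stream is read as text, and the replacement character re-encodes as three bytes. Here is a tiny standalone repro of my suspicion (the byte values are arbitrary examples, not taken from the real file):

using System;
using System.Text;

class DecodeRepro
{
    static void Main()
    {
        // A few bytes of the kind found in compressed data (arbitrary examples).
        byte[] original = { 0x1F, 0x8B, 0x08, 0x00, 0xFF, 0xFE, 0x80, 0x81 };

        // Roughly what StreamReader.ReadToEnd() does: decode the bytes as UTF-8.
        // Invalid byte sequences are silently replaced with U+FFFD.
        string asText = Encoding.UTF8.GetString(original);

        // Roughly what File.WriteAllText does: re-encode the string as UTF-8.
        // Each U+FFFD becomes 3 bytes, so the output is larger than the input.
        byte[] written = Encoding.UTF8.GetBytes(asText);

        Console.WriteLine(original.Length); // 8
        Console.WriteLine(written.Length);  // larger than 8, and the bytes no longer match
    }
}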
Also, the calling code is below:
string file_contents = GetFile(fileName, inBound);
File.WriteAllText(@".\" + fileName, file_contents);
return new FileInfo(@".\" + fileName);
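If it's useful, this is the direction I'm considering: an untested sketch that keeps the same Config.Settings keys but copies the raw bytes instead of decoding text, with the caller switched from WriteAllText to WriteAllBytes. I'm not sure whether this is the correct or complete fix:

// Uses System.IO and System.Net like the method above; Stream.CopyTo needs .NET 4+.
private static byte[] GetFileBytes(string file, bool inBound)
{
    string path = (inBound ? Config.Settings["FTPIn"] : Config.Settings["FTPOut"]) + file;

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(path);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.UseBinary = true; // ask for a binary-mode transfer rather than ASCII
    request.Credentials = new NetworkCredential(Config.Settings["FTPUser"], Config.Settings["FTPPass"]);

    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    using (Stream responseStream = response.GetResponseStream())
    using (MemoryStream buffer = new MemoryStream())
    {
        // Copy the raw bytes with no text decoding anywhere in the pipeline.
        responseStream.CopyTo(buffer);
        return buffer.ToArray();
    }
}

// Caller:
// File.WriteAllBytes(@".\" + fileName, GetFileBytes(fileName, inBound));
// return new FileInfo(@".\" + fileName);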