
I am looking to improve a WCF client/server so that it handles large numbers of small files faster than it currently does.

I have written a WCF client and a server to move files across a network.

I have it working by making a call from the client to the server (sending the name of the file I want to download as a parameter), and then having the server return a Stream.

Simplified Example:

//CLIENT CODE>>
Stream stream = syncService.GetStream(fileName);
//<<CLIENT CODE

//SERVER CODE>>
public Stream GetStream(string fileName)
{
   string filePathOnServer = ServerService.Service1.SERVER_FILES_PATH + fileName;
   return File.OpenRead(filePathOnServer);
}
//<<SERVER CODE

I then call GetStream repeatedly if I need to get several files, and save the streams to files on the client machine. It works acceptably when moving small numbers of large files. The issue I have is that the overhead of downloading a single file is about 1/10 of a second, regardless of size; so if I want to download a huge number of 1 KB files, I am essentially capped at a maximum of about 10 KB/s.
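For reference, the client side loops over the files one call at a time; a minimal sketch (the `fileNames` collection and `localDir` target folder are hypothetical names, not from the question):

```csharp
// Sketch of the current per-file approach. `syncService` is the WCF
// client proxy from above; `fileNames` and `localDir` are placeholders.
foreach (string fileName in fileNames)
{
    using (Stream stream = syncService.GetStream(fileName))
    using (FileStream target = File.Create(Path.Combine(localDir, fileName)))
    {
        stream.CopyTo(target); // each iteration pays the ~0.1 s per-call overhead
    }
}
```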

I am hoping that someone has a suggestion for an alternate implementation. I have tried returning a List of Streams from the Server, but I gather that WCF won't allow this.

I need to be able to do this without zipping the files.

I was considering trying to return one stream that was made up of several streams concatenated, but I'm unsure if there is a better approach.

  • 1
    Why not pass a list of files as the parameter and return all the files in a single stream. Simply prepend each file with a file length in the stream and parse and split the returned data. – Pete Jan 03 '13 at 14:20
  • @Pete Thanks for the reply. That's what I'm currently considering, but I thought there would maybe be a better option. I'll have a go at coding that while waiting to see if anyone has other suggestions. – HaemEternal Jan 03 '13 at 14:23
  • 1
    You don't specify how your service is hosted, but if it's hosted in IIS, you have to create and open an HTTP connection for each WCF method call. There's a good bit of overhead in that when you're dealing with lots of calls. – Pete Jan 03 '13 at 14:28
  • @Pete Yes, that's correct. I think that's where most of my overhead is, so I definitely need to decrease the number of function calls. – HaemEternal Jan 03 '13 at 14:31
  • 1
    http://stackoverflow.com/questions/276319/create-zip-archive-from-multiple-in-memory-files-in-c-sharp – Nick Bray Jan 03 '13 at 14:33

1 Answer


I'd change your WCF method to accept a collection of file names (i.e. List&lt;string&gt; or string[]), then pack them up. I know SharpZipLib works well for producing ZIP files.

You then stream the ZIP file back to the client, which in turn unpacks it and processes the files.

One bigger file should be orders of magnitude faster to transfer and lighter on bandwidth (because you'd deal with less network-related overhead), not to mention that you would have a single WCF invocation rather than one per file (a huge bottleneck).
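A sketch of that approach with SharpZipLib's ZipOutputStream/ZipInputStream (the operation name PackFiles and the client-side `localDir` are hypothetical). As noted below, SetLevel(0) stores entries uncompressed, so the ZIP acts purely as a container:

```csharp
using ICSharpCode.SharpZipLib.Zip;

// SERVER: pack the requested files into one ZIP stream.
// PackFiles is a hypothetical operation name.
public Stream PackFiles(string[] fileNames)
{
    var buffer = new MemoryStream();
    using (var zip = new ZipOutputStream(buffer) { IsStreamOwner = false })
    {
        zip.SetLevel(0); // 0 = store only: no compression, ZIP as a container
        foreach (string fileName in fileNames)
        {
            zip.PutNextEntry(new ZipEntry(fileName));
            byte[] bytes = File.ReadAllBytes(
                ServerService.Service1.SERVER_FILES_PATH + fileName);
            zip.Write(bytes, 0, bytes.Length);
            zip.CloseEntry();
        }
    }
    buffer.Position = 0;
    return buffer;
}

// CLIENT: unpack the single stream back into individual files.
using (var zip = new ZipInputStream(stream))
{
    ZipEntry entry;
    while ((entry = zip.GetNextEntry()) != null)
    {
        using (var target = File.Create(Path.Combine(localDir, entry.Name)))
            zip.CopyTo(target);
    }
}
```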

Alex
  • thanks for the suggestion. I'm doing my best to avoid zipping the files up unless I absolutely have to; but you are correct, this should get around the issue. – HaemEternal Jan 03 '13 at 14:33
  • One could even disable compression. The ZIP would just be a container format. – usr Jan 03 '13 at 14:47
  • @usr Agreed, but it mostly depends on what the files are (TXT files would shrink a lot, heavily compressed JPEGs not so much), because you might get a faster transfer even if you "waste" some time compressing the data (i.e. zipping takes 3 seconds, but streaming the smaller ZIP file saves 20 ... it's a win) – Alex Jan 03 '13 at 14:54