
I would like some advice on the best way to transfer custom objects from a "REST" server to a controller. In brief, we are running MVC 4 on a server that handles HttpGet requests. Initially, all responses from the server could be accommodated by serializing our custom classes to JSON and then deserializing them client side. However, now that we need to transfer large data (i.e. images) this approach no longer works.

The following is a condensed version of what I am currently doing, but I would like some advice on a better way to transfer large files (custom types).

classes:

[Serializable]
public class TradingPost
{
    public int TradingPostId { get; set; }
    public string UserId { get; set; }        
}

[Serializable]
public class TradingPostImage
{
    public int TradingPostImageId { get; set; }
    public string ImageName { get; set; }
    public string ImageData { get; set; }
}

[Serializable]
public class TradingPostWithImages
{
    public TradingPost Post { get; set; }
    public List<TradingPostImage> Images { get; set; }

    public TradingPostWithImages()
    {
        Post = new TradingPost();
        Images = new List<TradingPostImage>();
    }
}

Server side controller:

[HttpGet]
public ActionResult GetAllTradingPosts()
{
    List<TradingPostWithImages> postsAndImages = PostRepository.GetAllPostsAndImages();

    using (var memoryStream = new MemoryStream())
    {
        var binaryFormatter = new BinaryFormatter();
        binaryFormatter.Serialize(memoryStream, postsAndImages);

        var postsAsByteArray = memoryStream.ToArray();

        return File(postsAsByteArray, "application/octet-stream");
    }
}

Client side controller:

public List<TradingPostWithImages> GetItems()
{
    var dataGatewayPath = GetGateWayPath("GetTradingPosts");

    var webData = new MemoryStream();

    var request = WebRequest.CreateHttp(dataGatewayPath);
    var response = request.GetResponse() as HttpWebResponse;

    if (response != null)
    {
        var dataStream = response.GetResponseStream();

        if (dataStream != null)
        {
            dataStream.CopyTo(webData);
            dataStream.Close();
        }

        response.Close();
    }

    webData.Position = 0;
    var formatter = new BinaryFormatter();

    var activeItems = (List<TradingPostWithImages>)formatter.Deserialize(webData);

    return activeItems;
}

This works, but is there a way I can accomplish what I am after without deserializing a stream and then casting it to my custom type? Is there a better, or more efficient way to accomplish my task?

D0ubleGunz
  • Yeah, just stick to json/xml and have your REST server return an object with the images encoded to base64. And also, don't perform data layer logic at the controller :) – Sinaesthetic Mar 19 '14 at 01:45
  • Or generally what I do in this case is to return a list of objects with image ids on the first pass, then have an endpoint that takes the image id and retrieves the images from the server and loads them in after the fact (e.g. ) where the image action on the content controller requests the image with the id "abcd" from the server. This will give you a faster initial response and allow the images to load asynchronously. It's a little chattier, but better performance on the client side. – Sinaesthetic Mar 19 '14 at 01:48
  • @Sinaesthetic thank you for the feedback. For very large images, how do you contend with `maxJsonLength` problems? Just set the value very high? Oh, and no worries regarding the logic in the controller, this example is a "condensed" version that I thought would be easier to post. – D0ubleGunz Mar 19 '14 at 16:56
  • how large is "very" large? – Sinaesthetic Mar 20 '14 at 17:48
  • @Sinaesthetic If a couple of high resolution images (~15MB) are uploaded/downloaded the encoded base64 string quickly exceeds the default maxJsonLength. I have been told that encoding and transferring an image as Json isn't very efficient, that's why I thought I would try something different. – D0ubleGunz Mar 20 '14 at 22:48
  • Ok, I didn't realize the images would be *that* big. It is probably better that you stick with the stream. But I would still separate the calls so that you get the object data separate from the image data. At least that way you can still be productive while the images are coming in. – Sinaesthetic Mar 20 '14 at 23:40
  • So when you say "no longer works" ... can you elaborate on that? Is it just a bad experience because of application responsiveness due to large files, or are you flat out broken? What's up? I'm assuming it's the former and have posted an answer that I hope can help in some way. – Sinaesthetic Mar 21 '14 at 07:16
  • @Sinaesthetic, I would say the desired functionality was broken. The client could not send or receive very large images to/from the server when we were serializing them as Json encoded base64 strings. There is no restriction on the upload file size, so small image files work just fine, but high resolution photos were causing "Maximum request length exceeded". – D0ubleGunz Mar 21 '14 at 23:21
  • @Sinaesthetic, btw thank you for all your time on this! – D0ubleGunz Mar 21 '14 at 23:24
  • No problem. What version of IIS are you using? In newer versions, there's this other setting that needs to be changed: http://stackoverflow.com/questions/3853767/maximum-request-length-exceeded – Sinaesthetic Mar 22 '14 at 17:20

1 Answer


To directly answer your question based on some of the conversation in the comments, the only thing I can really suggest is a lazy-load type approach. Typically, with lazy loading, you request data as it is needed; in this case all of the data is needed right away, but the idea is the same. The most efficient approach I can suggest is to load all of the simple data first (the data that transfers quickly, which is pretty much everything but your images). I don't think you're going to get any more efficient than a byte stream over HTTP (maybe try UDP??), so performance can instead be tuned via user perception.

If your view layer is a web page, you can render all of the basic object data first by returning image IDs in a thin object on the first call (rather than a fat object complete with image data), then use those IDs in image tags that call a content action, which retrieves the image data from the server asynchronously by ID. This way, your page is usable even while the images load.
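A minimal sketch of that split, assuming MVC 4 conventions. The anonymous-object projection and `ImageRepository.GetImageBytes` are hypothetical names I'm using for illustration; substitute your own repository calls:

```csharp
using System.Linq;
using System.Web.Mvc;

public class ContentController : Controller
{
    // First call: a "thin" listing with image IDs only, small enough for plain JSON.
    [HttpGet]
    public ActionResult GetTradingPostSummaries()
    {
        var summaries = PostRepository.GetAllPostsAndImages()
            .Select(p => new
            {
                p.Post.TradingPostId,
                p.Post.UserId,
                ImageIds = p.Images.Select(i => i.TradingPostImageId)
            });

        return Json(summaries, JsonRequestBehavior.AllowGet);
    }

    // Second call(s): one image per request, streamed as raw bytes, so each
    // <img> tag on the page can load asynchronously.
    [HttpGet]
    public ActionResult GetImage(int id)
    {
        byte[] imageBytes = ImageRepository.GetImageBytes(id); // assumed helper
        return File(imageBytes, "image/jpeg");
    }
}
```

The image tags in the rendered page would then point at the image action, e.g. `<img src="/Content/GetImage/123" />`, keeping the initial response small and fast.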

If your view layer is something like a WinForms app, you can do roughly the same thing. The idea is that the simple data is retrieved and displayed immediately while images are loaded on one or more background threads, keeping your UI responsive. Just display some sort of loading indicator where the images are about to appear.

IMO, when there are large file transfers happening over HTTP, there's only so much you can do to be [creatively] efficient. It's more about dealing with interrupted connections with retry logic (resume download via byte offset, etc.) and keeping the UI responsive than anything else. As long as visible progress is being made without the user feeling like the application is locked up, the negative experience caused by those lengthy load times should be mitigated.
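As a rough illustration of the resume-via-byte-offset idea: the client can ask the server to continue from where the last attempt stopped using an HTTP Range request (`HttpWebRequest.AddRange`). This sketch assumes the server honors `Range` headers, and the retry policy is deliberately simplified:

```csharp
using System.IO;
using System.Net;

public static class Downloader
{
    // Appends to the local file starting at the byte offset we already have,
    // so an interrupted transfer can be resumed instead of restarted.
    public static void DownloadWithResume(string url, string localPath)
    {
        long existing = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

        var request = (HttpWebRequest)WebRequest.Create(url);
        if (existing > 0)
            request.AddRange(existing); // "Range: bytes=existing-"

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var remote = response.GetResponseStream())
        using (var local = new FileStream(localPath, FileMode.Append, FileAccess.Write))
        {
            remote.CopyTo(local);
        }
    }
}
```

Wrap the call in your own retry loop (catch `WebException`, back off, call it again) and the partial file simply picks up where it left off.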

So beyond this messaging strategy, I'd just look into techniques for streaming. This article may help: WebAPI Request Streaming support

Sinaesthetic