
I have a Silverlight application in which I call my WCF service to get data from the database. With a small number of records it works fine, but with many records it throws a System.OutOfMemoryException.

I traced the error in the WCF error log. Is there any way to compress the data that is sent from WCF to the Silverlight application?

user1796141

3 Answers


You can use IIS dynamic compression for WCF messages. See the following threads/articles:

Enabling dynamic compression

GZip compression with WCF hosted on IIS7
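
In short, those articles boil down to something like the following, a minimal sketch assuming IIS 7+ with the Dynamic Content Compression module installed; the application/soap+msbin1 MIME type assumes the default Silverlight binary message encoding, and the httpCompression section usually has to be edited in applicationHost.config rather than the site's web.config:

    <system.webServer>
      <!-- turn on dynamic compression for the site/application -->
      <urlCompression doStaticCompression="true" doDynamicCompression="true" />
      <httpCompression>
        <dynamicTypes>
          <!-- Silverlight's binary-encoded WCF responses use this MIME type;
               text/xml (basicHttpBinding) is typically covered by the default text/* entry -->
          <add mimeType="application/soap+msbin1" enabled="true" />
        </dynamicTypes>
      </httpCompression>
    </system.webServer>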

Dzmitry Martavoi
  • Thanks for the reply. I have configured HTTP compression in IIS, but when I put var request = HttpWebRequest.Create("http://localhost/RxReport/RXReports.aspx"); request.Headers["Accept"] = "application/json"; request.Headers["Accept-Encoding"] = "gzip, deflate"; in my Silverlight application as mentioned in the article, I get the following error: "Unhandled Error in Silverlight Application: This header must be modified using the appropriate property". Can you give me a sample of how I can use GZip compression in my application? Thanks – user1796141 Nov 05 '12 at 12:19

In your service web.config, add this element to both the service behavior and the endpoint behavior. It can then transfer up to 2 GB of data.

 <dataContractSerializer maxItemsInObjectGraph="2147483647"/>
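
For reference, a rough sketch of where that element sits in the service's web.config; the behavior and service names below are placeholders:

    <system.serviceModel>
      <behaviors>
        <serviceBehaviors>
          <behavior name="LargeGraphBehavior">
            <!-- raise the serializer's object-graph limit to int.MaxValue -->
            <dataContractSerializer maxItemsInObjectGraph="2147483647" />
          </behavior>
        </serviceBehaviors>
        <endpointBehaviors>
          <behavior name="LargeGraphEndpointBehavior">
            <dataContractSerializer maxItemsInObjectGraph="2147483647" />
          </behavior>
        </endpointBehaviors>
      </behaviors>
      <services>
        <service name="MyNamespace.MyService"
                 behaviorConfiguration="LargeGraphBehavior">
          <!-- endpoints unchanged -->
        </service>
      </services>
    </system.serviceModel>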
Sajeetharan
  • Thanks for the reply, but I have already set this property to the maximum limit. I am trying to transfer more than 500,000 records from WCF to Silverlight. – user1796141 Nov 05 '12 at 09:09
  • I think that is the maximum amount you can transfer via a WCF service. I would suggest you use a paging or threading mechanism to transfer the data in batches of 10,000; that's how I did it in my project. – Sajeetharan Nov 05 '12 at 09:25
  • Thank you for the reply. Can you please give me a sample in which I can use GZip compression to transfer data from WCF to Silverlight? – user1796141 Nov 06 '12 at 06:09

Transferring 500,000 (half a million) records in one go is too much for your system to handle. I'd also say it's too many for your users to handle.

You should break this down into pages of data and only return a couple of pages at a time. The Silverlight/WCF (RIA Services) DomainDataSource can handle all of this for you:

<riaControls:DomainDataSource QueryName="GetResults"
                              LoadSize="200"
                              PageSize="100"
                              AutoLoad="True"/>

You add a pager control to your page so the user can move through the pages of data.
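
For example, a minimal sketch of pairing a DataPager with the DomainDataSource above (the x:Name, the DataGrid, and the sdk namespace mapping to http://schemas.microsoft.com/winfx/2006/xaml/presentation/sdk are assumptions for illustration):

    <riaControls:DomainDataSource x:Name="ResultsSource"
                                  QueryName="GetResults"
                                  LoadSize="200"
                                  PageSize="100"
                                  AutoLoad="True" />

    <!-- grid shows only the current page of data -->
    <sdk:DataGrid ItemsSource="{Binding Data, ElementName=ResultsSource}"
                  AutoGenerateColumns="True" />

    <!-- pager lets the user move through the pages loaded by the DomainDataSource -->
    <sdk:DataPager Source="{Binding Data, ElementName=ResultsSource}"
                   PageSize="100" />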

This makes your application more responsive as you are only returning a small amount of data each time. Returning 500,000 records in one go will also more than likely cause timeouts for people with slow connections.

I'd also suggest you look into filtering your data so that you only return the data the user is interested in.

ChrisF