
I am looking to improve the performance of a project that uses WPF.

At the moment it gets a huge chunk of char data (in GBs) from a db and binds it directly to an ItemsControl whose items panel is a WrapPanel. (Note: I cannot change the fact that it gets GBs of data from the DB, e.g. by paging it in.)

The performance is... awful. I have looked into various virtualization techniques, but to me there seems to be a simpler solution.

"Don't add as many controls as there is data." 

My plan is to take just the chunks of the data that would be visible and update a fixed set of controls (e.g. a tile panel of controls that fills the screen, with a slight buffer on either side). As the user scrolls, it would replace the data in the existing controls, instead of adding and removing controls.
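To make the recycling idea concrete, here is a minimal sketch of the fixed-pool approach. `TileViewport` is a hypothetical name (not a WPF type); it models the set of on-screen slots whose contents are refilled on scroll, rather than creating or destroying controls:

```csharp
using System;
using System.Collections.Generic;

// Sketch: a fixed pool of item slots, refilled as the scroll offset changes.
// "TileViewport" is a hypothetical name, not a WPF type.
public sealed class TileViewport
{
    private readonly string[] _slots; // one slot per on-screen tile, plus buffer

    public TileViewport(int slotCount)
    {
        _slots = new string[slotCount];
    }

    public IReadOnlyList<string> Slots => _slots;

    // Refill the existing slots from the source at the new scroll position,
    // instead of adding and removing controls.
    public void Refresh(IReadOnlyList<string> source, int firstVisibleIndex)
    {
        for (int i = 0; i < _slots.Length; i++)
        {
            int src = firstVisibleIndex + i;
            _slots[i] = src < source.Count ? source[src] : string.Empty;
        }
    }
}
```

In a real WPF app each slot would be bound to one pre-created control; `Refresh` is essentially what the built-in `VirtualizationMode.Recycling` does for you.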

Question 1: does this seem like a good idea?

As the data is so large, it seems sensible to me not to hold it all in RAM but instead dump it to a local file, then read that file in chunks as and when I need it (possibly via a background paging thread). Would this make a good or bad difference to usability?

Question 2: any tips on this? I was planning on taking the basic setup from this question and answer: Seeking and writing files bigger than 2GB in C#
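The chunked-read idea can be sketched as follows. This assumes the dump file stores fixed-width UTF-16 chars (2 bytes each); `FileStream.Seek` takes a `long` offset, so files over 2 GB are not a problem:

```csharp
using System;
using System.IO;

static class ChunkReader
{
    // Sketch: read up to charCount chars starting at charOffset from a large file.
    // Assumes the file holds fixed-width UTF-16 chars (2 bytes each), so a char
    // offset maps directly to a byte offset.
    public static char[] ReadChunk(string path, long charOffset, int charCount)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            fs.Seek(charOffset * sizeof(char), SeekOrigin.Begin);
            var bytes = new byte[charCount * sizeof(char)];
            int read = fs.Read(bytes, 0, bytes.Length);
            var chars = new char[read / sizeof(char)];
            Buffer.BlockCopy(bytes, 0, chars, 0, read);
            return chars;
        }
    }
}
```

A background paging thread would call this ahead of the scroll position and hand the result to the UI thread.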

I can't shake the feeling that if this were indeed the right approach, I would have stumbled across it already instead of the seemingly complex virtualization techniques.

Clarification: I will most likely have three tiers of data: the file (the whole, big data set), an array in RAM (partial data, say 10,000 elements), and the "visual data", the array that I use to update the controls on screen.
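The glue between the tiers is just index arithmetic. A hypothetical helper like the one below maps a global element index to the aligned RAM window that should contain it, plus the index within that window; when the window start changes, the middle tier needs reloading from the file:

```csharp
using System;

static class WindowMath
{
    // Hypothetical helper for the middle tier: given a global element index,
    // return the start of the aligned RAM window (e.g. 10000 elements) that
    // should hold it, and the index of the element within that window.
    public static (long windowStart, int localIndex) Locate(long globalIndex, int windowSize)
    {
        long start = globalIndex / windowSize * windowSize;
        return (start, (int)(globalIndex - start));
    }
}
```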

  • RAM will be orders of magnitude faster than a file. Don't read/write from the file if possible. Why can't you keep the data in a collection or array, and as they scroll just update the controls on the screen using LINQ's .Take() function, instead of creating a whole bunch of controls? – Jon Aug 13 '14 at 21:00
  • *update the controls on the screen* - That's my plan :) *RAM will be faster* - true, I do plan on using an array as a halfway, but when we are talking GBs, surely filling RAM is a bad idea for every other operation/process? Take looks interesting, but doesn't seem to have a way of setting a "start" point. Any reason to use this over a standard array that I take a range of elements from? – chrispepper1989 Aug 13 '14 at 21:17
  • Use Skip() and Take() together to page in LINQ. .Skip((page -1) * PageSize).Take(PageSize); And don't worry about the GB of RAM, Windows will handle that with virtual memory. – Jon Aug 13 '14 at 21:53
  • I didn't consider virtual memory, thanks. Is Windows smart enough to virtualise my large array, or does it just start using virtual memory once the RAM is full? In which case, is that not potentially slower if it's putting often-used things in virtual memory? – chrispepper1989 Aug 14 '14 at 06:19
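For reference, the Skip/Take paging pattern from Jon's comment can be sketched as a small helper (`GetPage` is a hypothetical name; page numbers are 1-based as in the comment):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class Paging
{
    // Sketch of the Skip/Take paging from the comments; page is 1-based.
    public static List<T> GetPage<T>(IEnumerable<T> source, int page, int pageSize)
    {
        return source.Skip((page - 1) * pageSize).Take(pageSize).ToList();
    }
}
```

Skip() supplies the "start" point asked about above; on a plain array, Array.Copy over a range would do the same job without LINQ's enumeration overhead.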

0 Answers