
I am looking to create intellisense for a series of fields in a .NET web form that will let a user click one of the options, which will in turn fill out a series of fields (name, address, etc.). This was fairly trivial to implement using C#.NET and Ajax. However, I was recently forced to encrypt some of the key fields that are searched against the database, so the feature is now broken. Our only method of decrypting these searchable fields currently lives in the C#.NET code.

To fix the feature, I was thinking of either:

a) when the page loads, grab all of the records and store them in an in-memory array (unencrypted), then as the user types in the search field use LINQ or a lambda to grab the record(s) of interest; or

b) when the page loads, store all of the records in a JavaScript array (unencrypted) and perform the search client side.
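For reference, option (a) might look something like the sketch below. `CustomerRecord` and the `Load` step are hypothetical stand-ins for your own types and your existing C# decryption code; the point is just that once the decrypted values sit in a list, a prefix search is a one-line LINQ query.

```csharp
// Sketch of option (a): decrypt once, cache in memory, prefix-search with LINQ.
// CustomerRecord is a placeholder for your own record type.
using System;
using System.Collections.Generic;
using System.Linq;

public class CustomerRecord
{
    public int Id { get; set; }
    public string Name { get; set; }     // decrypted, searchable field
    public string Address { get; set; }
}

public static class CustomerSearch
{
    // Populated once (e.g. on app start) with already-decrypted rows.
    private static List<CustomerRecord> _records = new List<CustomerRecord>();

    public static void Load(IEnumerable<CustomerRecord> decrypted)
    {
        _records = decrypted.ToList();
    }

    // Case-insensitive prefix match, capped so the Ajax response stays small.
    public static IEnumerable<CustomerRecord> Search(string prefix, int max = 10)
    {
        return _records
            .Where(r => r.Name.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
            .Take(max);
    }
}
```

The `Take(max)` cap matters for intellisense: the user only needs the first handful of matches per keystroke, not the full result set.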

Which would be the best route performance-wise, for both my web server and the client's browsing experience?

I'm thinking this would be < 100k records.

jreed350z
  • Couldn't you encrypt the corresponding keys before searching? i.e. instead of searching for `name='smith'` you could search for `name='02h37xsdfh5h5h23h34h'` or whatever it looks like in the db – Daniel Jul 30 '12 at 14:07
  • That might work for exact matches, but say someone types in 'smi'. – jreed350z Jul 30 '12 at 14:23
  • Ok, I assumed it was a query by the full key, my bad – Daniel Jul 30 '12 at 19:20

3 Answers


Why not store the records in the server cache and make an Ajax request? Doing this will make the record set available to all users.
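A minimal sketch of that, assuming classic ASP.NET: one decrypted record set goes into `HttpRuntime.Cache` so every user's autocomplete request hits the same copy instead of re-decrypting per session. `CustomerRecord` and `LoadAndDecryptAll` are placeholders for your own type and your existing decryption code.

```csharp
// Sketch: a single decrypted record set shared across all users via the
// ASP.NET server cache, rebuilt automatically when the entry expires.
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public class CustomerRecord
{
    public int Id;
    public string Name;     // decrypted, searchable
    public string Address;
}

public static class RecordCache
{
    private const string Key = "DecryptedRecords";

    public static List<CustomerRecord> GetRecords()
    {
        var records = HttpRuntime.Cache[Key] as List<CustomerRecord>;
        if (records == null)
        {
            records = LoadAndDecryptAll();
            HttpRuntime.Cache.Insert(
                Key, records,
                null,                            // no cache dependency
                DateTime.UtcNow.AddMinutes(30),  // rebuild every 30 minutes
                Cache.NoSlidingExpiration);
        }
        return records;
    }

    private static List<CustomerRecord> LoadAndDecryptAll()
    {
        // placeholder: run the existing SELECT + C# decryption here
        return new List<CustomerRecord>();
    }
}
```

The Ajax handler then searches `RecordCache.GetRecords()` instead of the database, so the decryption cost is paid once per cache rebuild rather than once per keystroke.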

dittu

Wow, so this is encrypted dynamic data.

Sending a lot of records to the client on each page load would be slow.

You might go with a dynamic cache where you reload the data every few seconds.
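One hypothetical way to sketch such a dynamic cache: a background timer re-reads and re-decrypts the records on an interval, and searches always read the latest complete snapshot. `CustomerRecord` and `FetchAndDecrypt` are placeholders for your own type and decryption routine.

```csharp
// Sketch of a "dynamic cache": a background timer periodically swaps in a
// fresh decrypted snapshot, so searches never wait on the database.
using System;
using System.Collections.Generic;
using System.Threading;

public class CustomerRecord
{
    public int Id;
    public string Name;  // decrypted, searchable
}

public class RefreshingCache
{
    // Readers always see a complete list; the timer replaces the reference.
    private volatile List<CustomerRecord> _snapshot = new List<CustomerRecord>();
    private readonly Timer _timer;

    public RefreshingCache(TimeSpan interval)
    {
        Refresh(null);  // load once immediately
        _timer = new Timer(Refresh, null, interval, interval);
    }

    public List<CustomerRecord> Current => _snapshot;

    private void Refresh(object state)
    {
        _snapshot = FetchAndDecrypt();
    }

    private List<CustomerRecord> FetchAndDecrypt()
    {
        // placeholder: query the DB and decrypt the searchable fields here
        return new List<CustomerRecord>();
    }
}
```

Swapping the whole list reference (rather than mutating it in place) keeps in-flight searches consistent without locking.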

paparazzo

How large is each row? ~100k rows can be a lot of data.

On the Client

100k rows on the client is a lot of data, even if it's only a couple of columns. At a rough ~100 bytes per row, that's on the order of 10 MB of JSON before you've even started searching it. It's an issue on a desktop browser; it will kill a mobile browser.

On the Server

This is more plausible.

Things to consider:

  • How much memory will loading all the results into memory consume?
  • What's the latency (network, database) to load all the records into memory? Maybe it can be done async to lessen the impact on the user.
  • Searching in C# can be fast, especially when using dictionaries, but it is not as fast as a database. Have you considered storing only the data that needs decrypting in memory and leaving the rest in the database, fetching it from the DB when needed?

Ajax is still possible (assuming the data is stored on the server). You'll have to route the search requests to the proper store (in memory, database, or a mixture of both).
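That hybrid route could look roughly like this sketch: keep only the decrypted searchable field plus an ID in memory, match the prefix there, then pull the full rows from the database by ID. The type names and the `FetchRowsByIds` step mentioned in the comments are hypothetical.

```csharp
// Sketch of the hybrid store: a slim in-memory index of (Id, decrypted Name),
// with everything else left encrypted in the database and fetched on demand.
using System;
using System.Collections.Generic;
using System.Linq;

public class SearchIndexEntry
{
    public int Id;        // primary key back into the encrypted table
    public string Name;   // the only decrypted value held in memory
}

public static class HybridSearch
{
    public static List<SearchIndexEntry> Index = new List<SearchIndexEntry>();

    // Step 1 (in memory): find the IDs whose decrypted name matches the prefix.
    public static IEnumerable<int> MatchIds(string prefix, int max = 10)
    {
        return Index
            .Where(e => e.Name.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
            .Take(max)
            .Select(e => e.Id);
        // Step 2 (in the DB): FetchRowsByIds(ids) with a normal parameterized
        // query to get address etc. for just those few rows.
    }
}
```

This keeps the memory footprint proportional to one decrypted column rather than the whole 100k-row record set.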

Chuck Conway
  • I like your idea of only storing the data that needs to be decrypted, and having an ID column to point to the record(s) in question. I'll do some testing to see what type of latency and memory will be consumed and then I suppose multiply that by avg daily usage. I'm also very curious about using a service to load these records into a shared cache. I think even if the records were only refreshed 1 or 2 times a day, that it may suffice. – jreed350z Jul 30 '12 at 15:46
  • Of course I suppose I'll need to check with the business side to verify how frequent the refreshes need to be. – jreed350z Jul 30 '12 at 15:48