Assume the code contains no checks for whether it is running as a 32-bit or 64-bit process.

I have a pre-compiled serializer DLL that reads objects from a stream. The same deserialization DLL runs about 10x slower in the customer's 64-bit environment, but I can't reproduce the slowdown on my machine. As a 32-bit process it runs fast on both my machine and the customer's.

A profiler shows much of the time being spent in GC, but the memory snapshots look fine (the 64-bit process is 1.5-2x bigger than the 32-bit one, but nothing notable).

What should I look for?
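To make the two environments comparable, here is a minimal measurement sketch (the `MySerializer.Deserialize` call and the `payload.bin` file name are hypothetical stand-ins for the actual serializer DLL and input) that records wall-clock time and per-generation GC collection counts around the deserialization call, so the same numbers can be compared on the 32-bit and 64-bit runs:

```csharp
using System;
using System.Diagnostics;
using System.IO;

class DeserializationProbe
{
    static void Main()
    {
        // Snapshot GC collection counts before the work.
        int gen0 = GC.CollectionCount(0);
        int gen1 = GC.CollectionCount(1);
        int gen2 = GC.CollectionCount(2);

        var sw = Stopwatch.StartNew();
        using (var stream = File.OpenRead("payload.bin"))   // hypothetical input file
        {
            // Hypothetical call into the pre-compiled serializer DLL.
            object result = MySerializer.Deserialize(stream);
            GC.KeepAlive(result);
        }
        sw.Stop();

        Console.WriteLine($"Elapsed: {sw.Elapsed}");
        Console.WriteLine($"Gen0: {GC.CollectionCount(0) - gen0}, " +
                          $"Gen1: {GC.CollectionCount(1) - gen1}, " +
                          $"Gen2: {GC.CollectionCount(2) - gen2}");
    }
}
```

If the 64-bit run shows disproportionately more Gen2 collections for the same payload, that would point at allocation/GC behavior rather than the deserialization logic itself.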

Vlad
  • _"What should I look for?"_ -- too many different things to make this a good fit for Stack Overflow. That said, you might want to double-check the bitness of the process in your tests. Many people get tripped up by the "Prefer 32-bit" option in the project's settings, causing a process to run as 32-bit when they thought it is 64-bit. (Granted, in your description this doesn't seem like it'd lead to the problem you're asking about...but still, it's always worth going back and double-checking your assumptions; make sure the code is really running exactly as you think it is). – Peter Duniho Jul 09 '17 at 20:34
  • Related? https://stackoverflow.com/q/4137335/555045 – harold Jul 09 '17 at 20:52
  • @harold, thanks for the link, it really looks similar to my issue – Vlad Jul 09 '17 at 20:55
  • Running the Any CPU option means that it can run on 16-bit, 32-bit, and 64-bit microprocessors. The micros have pipeline registers inside which basically can load multiple instructions and perform parallel processing. Code compiled for 32-bit will run faster than code compiled for 16-bit since 32-bit can load more instructions in parallel, and code compiled for 64-bit will run faster than code compiled for 32-bit. The Floating Point Unit (FPU) on a 32-bit processor can perform int arithmetic in a single instruction while a 16-bit processor requires multiple instructions. Same for 64-bit with long arithmetic. Any CPU compiles for a 16-bit micro. – jdweng Jul 09 '17 at 23:46
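
Following up on the bitness check suggested in the first comment, here is a minimal sketch that prints the process bitness and the GC mode at runtime. Comparing its output between your machine and the customer's 64-bit environment can rule out a "Prefer 32-bit" mix-up and show whether one side runs workstation GC and the other server GC, a difference that can noticeably change GC time for allocation-heavy deserialization (the class and method names here are just for illustration):

```csharp
using System;
using System.Runtime;

static class RuntimeCheck
{
    // Prints which bitness the process actually runs as and which GC mode is active.
    public static void Print()
    {
        Console.WriteLine($"Is64BitProcess:         {Environment.Is64BitProcess}");
        Console.WriteLine($"Is64BitOperatingSystem: {Environment.Is64BitOperatingSystem}");
        Console.WriteLine($"IntPtr.Size:            {IntPtr.Size}");   // 4 = 32-bit, 8 = 64-bit
        Console.WriteLine($"IsServerGC:             {GCSettings.IsServerGC}");
        Console.WriteLine($"LatencyMode:            {GCSettings.LatencyMode}");
    }
}
```

Calling `RuntimeCheck.Print()` at startup in both environments and comparing the output is a cheap first step before deeper profiling.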

0 Answers