
I am working with 3D numpy arrays containing sparse data: isolated, roughly Gaussian-shaped distributions at random locations (representing particle light intensities in 3D space), with some variation in width and intensity. What method or filter should I use to essentially "normalize" this data, that is, to roughly equalize the intensities/amplitudes of these isolated distributions while otherwise largely preserving their widths, so that they all end up with roughly the same characteristics?

George
  • What about https://stackoverflow.com/questions/21030391/how-to-normalize-an-array-in-numpy? – Riccardo Petraglia Oct 29 '20 at 23:12
  • I believe this applies primarily to normalizing a full dataset, rather than the individual distributions within it? I suppose writing something up to box in the non-zero data could work (a sketch of that idea follows below), but I was wondering if there was a more elegant solution that would also handle more closely-packed Gaussians – George Oct 29 '20 at 23:16
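
Below is a minimal sketch of the "box in the non-zero data" idea from the comment, using connected-component labelling from `scipy.ndimage` to find each isolated blob and rescale it by its own peak. The function name `equalize_blobs` and the `threshold` parameter are illustrative, not part of the original question, and the approach breaks down once neighbouring Gaussians merge into a single connected region.

```python
import numpy as np
from scipy import ndimage

def equalize_blobs(volume, threshold=0.0):
    """Roughly equalize the peak amplitude of each isolated blob in a 3D
    array while leaving its width/shape unchanged. Assumes blobs are
    separated by voxels at or below `threshold`."""
    out = volume.astype(float).copy()

    # Label connected regions of above-threshold voxels.
    labels, n_blobs = ndimage.label(volume > threshold)

    # Peak value inside each labelled blob.
    peaks = ndimage.maximum(volume, labels=labels, index=range(1, n_blobs + 1))

    # Scale each blob so its maximum becomes 1.
    for i, peak in enumerate(peaks, start=1):
        if peak > 0:
            out[labels == i] /= peak
    return out
```

Because the scaling is per connected component, two Gaussians that overlap above the threshold would be treated as one blob and scaled together, which is the limitation alluded to in the comment about closely-packed distributions.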

0 Answers