I have a Django application that resets unix user passwords, running on an Ubuntu machine, but my development environment is OS X, and I've run into this annoying situation:
OS X:
>>> import crypt
>>> crypt.crypt('test','$1$VFvON1xK$')
'$1SoNol0Ye6Xk'
Linux:
>>> import crypt
>>> crypt.crypt('test','$1$VFvON1xK$')
'$1$VFvON1xK$SboCDZGBieKF1ns2GBfY50'
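If I'm reading the OS X output right, its crypt() doesn't understand the "$1$" (MD5) prefix at all and silently falls back to old-style DES, treating "$1" as the two-character salt. A quick way to see this (the check is just my own diagnostic, nothing official):
>>> import crypt
>>> salt = '$1$VFvON1xK$'
>>> out = crypt.crypt('test', salt)
>>> out.startswith(salt)  # True on Linux (real MD5-crypt), False on OS X (DES fallback)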
From reading the pydoc for crypt, I saw that it uses an OS-specific crypt() implementation, so I also tested the following C code on both systems, with the same results as in Python:
#include <unistd.h>   /* crypt() prototype (link with -lcrypt on Linux) */
#include <stdio.h>    /* puts() */

int main(void) {
    char *des = crypt("test", "$1$VFvON1xK$ls4Zz4XTEuVI.1PnYm28.1");
    puts(des);
    return 0;
}
How can I make OS X's crypt() implementation generate the same results as Linux's crypt()?
And why isn't that difference covered by the Python implementation (as I would expect for cross-platform deployment)?
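For what it's worth, the only cross-platform workaround I've found so far is to bypass the system crypt() entirely and use a pure-Python MD5-crypt implementation, e.g. the third-party passlib package (the exact API below is taken from its docs as I understand them, so treat it as a sketch rather than tested code):
# pip install passlib  (third-party, pure Python, so it behaves the same on OS X and Linux)
from passlib.hash import md5_crypt

# MD5-crypt is deterministic for a given salt+password, so this should reproduce
# the Linux value above: $1$VFvON1xK$SboCDZGBieKF1ns2GBfY50
print(md5_crypt.using(salt='VFvON1xK').hash('test'))
Is that really the right way to go, or is there a way to get the native crypt() to behave?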