I'm currently migrating a script from Perl to Python 3 (3.6.5). It runs on Windows Server 2016. The script builds a command line with arguments and executes the resulting string with subprocess.check_output. One of the argument options is -location:"my street", and the location can contain special characters such as umlauts (äöß) or (áŠ).
When I run the Perl script, the special characters are passed correctly to the application. When I run the Python script, they are replaced by question marks in the application. I think the called application expects a UTF-8 encoded argument string.
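To narrow things down, I also checked which encodings Python reports on this machine; this is just a sketch of that check, nothing in it is specific to the application:
import locale
import sys

# encoding Python uses when it has to turn str into bytes for the OS (paths etc.)
print(sys.getfilesystemencoding())
# encoding of the attached console (stdout)
print(sys.stdout.encoding)
# ANSI code page / locale default used for implicit byte<->text conversions
print(locale.getpreferredencoding(False))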
The Perl script runs in UTF-8 mode:
use utf8;
binmode( STDOUT, ":utf8" );
The Python script was created with PyCharm, is UTF-8 encoded, and its first line contains:
# -*- coding: utf-8 -*-
I have tried several things to set the encoding of the subprocess arguments to UTF-8, but none of them worked; one of those attempts is sketched below. I also used procmon.exe to compare the application call between the Perl and the Python script. What I can see is that the command line procmon shows for the Python subprocess call is readable to me, while the one for the working Perl call is not. In procmon, the location string of the Perl call looks like this:
-location:"HQ/äöööStraße"
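For example, one of the variants I tried looked roughly like this (a sketch only; the call is the same as the Python code further below, and the PYTHONIOENCODING tweak is just an example of the kind of thing I experimented with):
# variant of the call with the environment nudged towards UTF-8
# (sketch; this did not change the question marks)
import os
import subprocess

env = os.environ.copy()
env['PYTHONIOENCODING'] = 'utf-8'

result = subprocess.check_output(
    ['C:\\PROGRAM FILES\\Application\\bin\\cfg.exe', "-modify", "-location:123á456ß99"],
    shell=False,
    stderr=subprocess.STDOUT,
    env=env,
)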
The Perl code looks like this:
$command = "C:\\PROGRAM FILES\\Application\\bin\\cfg.exe";
$operand = "-modify -location:123á456ß99";
$result = `$command $operand`;
The Python code looks like this:
# -*- coding: utf-8 -*-
import subprocess
result = subprocess.check_output(
    ['C:\\PROGRAM FILES\\Application\\bin\\cfg.exe', "-modify", "-location:123á456ß99"],
    shell=False,
    stderr=subprocess.STDOUT,
)
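For comparison, I assume the closest Python counterpart of the Perl backticks would be a single command-line string passed through the shell, roughly like this (sketch only):
# single string through the shell, mirroring the Perl backtick call (sketch)
# the path is quoted here because of the space in "PROGRAM FILES"
import subprocess

command = '"C:\\PROGRAM FILES\\Application\\bin\\cfg.exe" -modify -location:123á456ß99'
result = subprocess.check_output(command, shell=True, stderr=subprocess.STDOUT)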
Any idea what I have to do so that the Python arguments are passed correctly to the application?