After some searching and checking previous answers such as Passing objects from python to powershell, it appears that the best way to send objects from a Python script to a PowerShell script or command is as JSON.
However, with something like this (`dir_json.py`):

    from json import dumps
    from pathlib import Path

    for fn in Path('.').glob('**/*'):
        print(dumps({'name': str(fn)}))
You can do this:
    python .\dir_json.py | ConvertFrom-JSON
And the result is OK, but the problem I'm hoping to solve is that `ConvertFrom-JSON` seems to wait until the script has completed before reading any of the JSON, even though each individual JSON object ends on its own line. This can easily be verified by adding a line like `time.sleep(1)` after the `print`.
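For reference, here is a sketch of what that delayed variant (the `dir_json_slow.py` used below) might look like. The `emit_slowly` helper is my own name for illustration; the real script would simply be the loop above with a flush and a sleep added:

```python
# Sketch of dir_json_slow.py: emit one JSON object per line, flushing and
# sleeping after each print so any downstream buffering becomes visible.
from json import dumps
from pathlib import Path  # the real script would glob with Path('.').glob('**/*')
from sys import stdout
from time import sleep


def emit_slowly(paths, delay=1):
    """Print one JSON object per path, pausing `delay` seconds between lines."""
    for fn in paths:
        print(dumps({'name': str(fn)}))
        stdout.flush()  # push each line out of Python's buffer immediately
        sleep(delay)    # artificial per-line delay


# tiny demo with fake paths and no delay; the real script would use
# emit_slowly(Path('.').glob('**/*'), delay=1)
emit_slowly(['example/a.txt', 'example/b.txt'], delay=0)
```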
Is there a better way to send objects from Python to PowerShell than using JSON objects? And is there a way to actually stream them as they are written, instead of passing the entire output of the Python script after the script completes?
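For concreteness, the one-object-per-line output above is essentially the "JSON Lines" format, which lends itself to incremental parsing. A minimal Python sketch of a consumer that handles each line as it arrives (`stream_objects` is a name of my own, not an existing API):

```python
# Incremental consumer for line-per-object JSON: each line is parsed the
# moment it is read, so nothing waits for the producer to finish.
import json


def stream_objects(lines):
    """Yield one parsed object per non-empty line of JSON."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)


# demo with literal lines; a real consumer would pass sys.stdin instead,
# e.g. when invoked as: python dir_json.py | python consumer.py
for obj in stream_objects(['{"name": "a.txt"}\n', '{"name": "b.txt"}\n']):
    print(obj['name'])
```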
I ran into `jq`, which was recommended by "people on the internet" as a solution to my type of problem, stating that `ConvertFrom-JSON` doesn't allow streaming but `jq` does. However, this did nothing to improve my situation:
    python .\dir_json_slow.py | jq -cn --stream 'fromstream(1|truncate_stream(inputs))' | ConvertFrom-JSON
To make `jq` play nice, I did change the script to write a single list of objects instead of separate objects:
    from sys import stdout
    from time import sleep
    from json import dumps
    from pathlib import Path

    first = True
    stdout.write('[\n')
    for fn in Path('.').glob('**/*'):
        if first:
            stdout.write(dumps({'name': str(fn)}))
            first = False
        else:
            stdout.write(',\n' + dumps({'name': str(fn)}))
        stdout.flush()
        sleep(.1)
    stdout.write('\n]')
(Note that the problem isn't `ConvertFrom-JSON` holding things up at the end: `jq` itself only starts writing output once the Python script completes.)