Using asyncio for this is a good option I was not aware of.
My best attempt with asyncio was:
import asyncio

async def run_command():
    # Create the subprocess with both streams piped
    process = await asyncio.create_subprocess_exec(
        './test.sh',
        stdout=asyncio.subprocess.PIPE,  # Redirect stdout to a pipe
        stderr=asyncio.subprocess.PIPE,  # Redirect stderr to a pipe
    )

    # Read stdout and stderr asynchronously, echoing and collecting each line
    captured_output = b''
    async for line in process.stdout:
        print(line.decode().strip())
        captured_output += line
    async for line in process.stderr:
        print(line.decode().strip())
        captured_output += line

    await process.wait()
    print(captured_output)

# Run the asyncio event loop
asyncio.run(run_command())
########################################
This fulfills all my requirements. A nice-to-have would be if captured_output did not have to be built up with += but could instead be obtained with a final seek(0) and read() on process.stdout. However, I did not find any way to rewind the stream so that I can read the whole output again.
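As far as I can tell, process.stdout is an asyncio.StreamReader, which is not seekable, so there is no seek(0)/read() rewind. An alternative that avoids the repeated += (which copies the whole bytes object each time) is to collect the lines in a list and join once at the end. A sketch of that idea; run_command and the small inline shell command standing in for ./test.sh are my own names, not from the original:

```python
import asyncio

async def run_command(*cmd: str) -> bytes:
    # './test.sh' in the original; any argv works here
    process = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )

    chunks = []  # collect lines, join once at the end instead of repeated +=
    async for line in process.stdout:
        chunks.append(line)
    async for line in process.stderr:
        chunks.append(line)

    await process.wait()
    return b''.join(chunks)

captured_output = asyncio.run(
    run_command('sh', '-c', 'echo out; echo err 1>&2'))
```

b''.join() is O(total length), whereas repeated += on bytes is quadratic in the worst case, so this also scales better for large outputs.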
Another question is whether this solution is deadlock-proof.
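Regarding the deadlock question: reading stdout to EOF before touching stderr is, as I understand it, not deadlock-proof. If the child writes more than the OS pipe buffer (typically 64 KiB on Linux) to stderr while the loop above is still draining stdout, the child blocks on the full stderr pipe and stdout never reaches EOF. The usual fix is to read both pipes concurrently. A sketch of that pattern; pump and run_command are my own names, and a small inline shell command stands in for ./test.sh:

```python
import asyncio

async def pump(stream, sink):
    # Drain one stream line by line so its pipe buffer never fills up
    async for line in stream:
        sink.append(line)

async def run_command(*cmd: str) -> bytes:
    process = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    out, err = [], []
    # Read both pipes concurrently: sequential reads can block forever
    # if the child fills the *other* pipe's buffer first
    await asyncio.gather(pump(process.stdout, out),
                         pump(process.stderr, err))
    await process.wait()
    return b''.join(out + err)

captured_output = asyncio.run(
    run_command('sh', '-c', 'echo hello; echo oops 1>&2'))
```

If line-by-line streaming is not required, `await process.communicate()` does the same concurrent draining internally and returns both buffers at once.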