• Re: Do subprocess.PIPE and subprocess.STDOUT sametime

    From Horst Koiner@21:1/5 to All on Fri May 12 09:55:17 2023
    Using asyncio for this is a good possibility I was not aware of.

    My best try with asyncio was:
    import asyncio

    async def run_command():
        # Create subprocess
        process = await asyncio.create_subprocess_exec(
            './test.sh',
            stdout=asyncio.subprocess.PIPE,  # Redirect stdout to a pipe
            stderr=asyncio.subprocess.PIPE   # Redirect stderr to a pipe
        )

        # Read stdout and stderr asynchronously
        captured_output = b''
        async for line in process.stdout:
            print(line.decode().strip())
            captured_output += line
        async for line in process.stderr:
            print(line.decode().strip())
            captured_output += line

        await process.wait()
        print(captured_output)


    # Run the asyncio event loop
    asyncio.run(run_command())
    ########################################

    This fulfills all my requirements. A nice-to-have would be if captured_output did not have to be built up with repeated +='s, but could instead be read back with a final seek(0) and read() of process.stdout. But I couldn't find any way to rewind the stream so that I can read the whole output again.
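
    (A minimal sketch of one alternative, just to avoid the repeated += on an immutable bytes object, since the pipe itself can't be rewound with seek(): collect the lines in a list inside run_command and join them once at the end.)

    chunks = []
    async for line in process.stdout:
        print(line.decode().strip())
        chunks.append(line)
    captured_output = b''.join(chunks)
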
    Another question is whether this solution is deadlock-proof.

    Thank you all for the already very valuable input!

    Greetings,
    Horst

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Angelico@21:1/5 to Horst Koiner on Sat May 13 07:41:31 2023
    On Sat, 13 May 2023 at 07:21, Horst Koiner <koinerhorst6@gmail.com> wrote:

    > Using asyncio for this is a good possibility I was not aware of.
    >
    > My best try with asyncio was:
    > import asyncio
    >
    > async def run_command():
    >     # Create subprocess
    >     process = await asyncio.create_subprocess_exec(
    >         './test.sh',
    >         stdout=asyncio.subprocess.PIPE,  # Redirect stdout to a pipe
    >         stderr=asyncio.subprocess.PIPE   # Redirect stderr to a pipe
    >     )
    >
    >     # Read stdout and stderr asynchronously
    >     captured_output = b''
    >     async for line in process.stdout:
    >         print(line.decode().strip())
    >         captured_output += line
    >     async for line in process.stderr:
    >         print(line.decode().strip())
    >         captured_output += line
    >
    >     await process.wait()
    >     print(captured_output)
    >
    >
    > # Run the asyncio event loop
    > asyncio.run(run_command())
    > ########################################
    >
    > This fulfills all my requirements. A nice-to-have would be if captured_output did not have to be built up with repeated +='s, but could instead be read back with a final seek(0) and read() of process.stdout. But I couldn't find any way to rewind the stream so that I can read the whole output again.
    > Another question is whether this solution is deadlock-proof.


    No, it's not, but the best part is, it's really close to being so! Asynchronous
    I/O is perfect for this: you need to wait for any of three events
    (data on stdout, data on stderr, or process termination). So here it
    is as three tasks:

    captured_output = b""
    async def collect_output(stream):
    global captured_output
    async for line in stream:
    print(line.decode().strip())
    captured_output += line

    (You can play around with other ways of scoping this; I'm just using a
    global for simplicity.)

    Inside run_command, you can then spawn three independent tasks and
    await them simultaneously. Once all three finish, you have your
    captured output, and the process will have terminated.
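
    A minimal sketch of that, reusing the create_subprocess_exec call from
    your example (names are just illustrative):

    async def run_command():
        process = await asyncio.create_subprocess_exec(
            './test.sh',
            stdout=asyncio.subprocess.PIPE,
            stderr=asyncio.subprocess.PIPE,
        )
        # Drain stdout, drain stderr, and wait for exit, all concurrently.
        await asyncio.gather(
            collect_output(process.stdout),
            collect_output(process.stderr),
            process.wait(),
        )
        print(captured_output)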

    This would then be guaranteed deadlock-proof (and, if you needed to
    feed data on stdin, you could do that with a fourth task and it'd
    still be deadlock-proof, even if it's more than one pipe buffer of
    input), since all the pipes are being managed concurrently.
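
    (A rough sketch of the stdin idea, assuming the process was created with
    stdin=asyncio.subprocess.PIPE and input_data is whatever you need to send:)

    async def feed_input(process, input_data):
        # Write everything, let the event loop handle backpressure, then close.
        process.stdin.write(input_data)
        await process.stdin.drain()
        process.stdin.close()

    (and pass feed_input(process, input_data) as a fourth awaitable in the
    same gather call.)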

    Even cooler? You can scale this up to multiple processes by calling
    run_command more than once as separate tasks!
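
    For instance (noting that the single captured_output global would then
    mix output from both commands, so you'd probably want per-command buffers):

    async def main():
        # Run two commands concurrently; each manages its own pipes.
        await asyncio.gather(run_command(), run_command())

    asyncio.run(main())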

    ChrisA

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)