bash - How can I split and re-join STDOUT from multiple processes?
I am working on a pipeline that has a couple of branch points which merge back together later. It looks something like this:
            command2
           /        \
command1              command4
           \        /
            command3

Each command reads from STDIN and writes to STDOUT. command1's STDOUT needs to be passed to both command2 and command3, which are run sequentially, and their outputs need to be effectively concatenated and passed on to command4. I initially thought that something like this would work:
$ command1 | (command2; command3) | command4
This does not work, because only command2's STDOUT gets sent on to command4, and when I remove command4 it is clear that command3 never receives command1's stream at all. In other words, command2 appears to exhaust or consume the stream rather than passing it along. I get the same result with { command2 ; command3 ; } in the middle.
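To make the failure concrete, here is a minimal reproduction with stand-in commands (seq and wc -l are only placeholders for my real commands, not part of the actual pipeline):

$ seq 5 | ( wc -l ; wc -l )
5
0

The first wc -l consumes the whole stream and reports 5 lines; by the time the second wc -l runs, STDIN is already at EOF, so it reports 0. So then I thought I should use tee with process substitution, and tried this: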
$ command1 | tee >(command2) | command3 | command4
But surprisingly this does not work either - it appears that both command1's output and command2's output get piped into command3, which produces errors, and only command3's output gets piped into command4. I did find that the following gives command2 and command3 the correct input and output:
$ command1 | tee >(command2) >(command3) | command4
However, command1's output is also passed along to command4, which causes problems because command4 only expects the output of command2 and command3. The solution I have arrived at is hacky, but it works:
$ command1 | tee >(command2) >(command3) > /dev/null | command4
That suppresses command1's output to command4, while the STDOUT of command2 and command3 is still collected and piped into command4. It works, but I feel like I am missing a more obvious solution - am I? I have read dozens of threads and have not found a solution to this problem that works for my use case, nor have I seen this exact problem of splitting and re-joining streams addressed (although I can hardly be the first one to deal with it). Should I just be using named pipes? I tried them, but had difficulty getting them working, so that is a story for another thread. I am using Bash on RHEL 5.8.
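For reference, the hacky version can be reproduced with stand-in commands like this (seq, wc -l and tac are only placeholders; the order in which their outputs reach the final command is not guaranteed, since the process substitutions run asynchronously):

$ seq 3 | tee >(wc -l) >(tac) > /dev/null | cat

Here cat receives "3" from wc -l and "3 2 1" from tac, while the raw seq output itself is discarded via /dev/null - which is exactly the behaviour I want for command4.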
You can play with file descriptors like this:
( ( date | tee >(wc >&3) | wc ) 3>&1 ) | wc
or
( ( command1 | tee >(command2 >&3) | command3 ) 3>&1 ) | command4
To explain this: tee >(wc >&3) will output the original data on stdout, and the inner wc will send its results to FD 3. The outer 3>&1 then redirects FD 3 back to STDOUT, so the output from both wc commands is sent on to the trailing wc.
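For a concrete run you can try, substitute seq for date, wc -l for the counters, and cat for the trailing wc so both results stay visible (the relative order of the two lines is not guaranteed, since the process substitution runs asynchronously):

( ( seq 3 | tee >(wc -l >&3) | wc -l ) 3>&1 ) | cat
# prints two lines, each "3": one count from the inner wc -l (via FD 3)
# and one from the outer wc -l (via its normal stdout)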
However, nothing in this construct (or in your own solutions) guarantees that the output won't be mangled. That is, partial lines from command2 could get mixed in with lines from command3 - if that is a concern, you would have to do one of two things:
- Write your own merging tool that combines the two outputs without splitting lines
- Have command2 and command3 each write to a file, and use cat to merge the data as input into command4 (a minimal sketch of this follows below)
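A minimal sketch of the second option (the mktemp usage and variable names are just illustrative; buffering command1's output to a file as well keeps everything synchronous, at the cost of streaming):

orig=$(mktemp); out2=$(mktemp); out3=$(mktemp)

command1 > "$orig"               # capture command1's output once
command2 < "$orig" > "$out2"     # run command2 over a copy of it
command3 < "$orig" > "$out3"     # run command3 over the same copy
cat "$out2" "$out3" | command4   # merge both results, in a fixed order
rm -f "$orig" "$out2" "$out3"

Since command2 and command3 finish before cat runs, command4 always sees whole lines in a predictable order.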