From: | Christof Warlich |
Subject: | Re: [Help-bash] shell redirection |
Date: | Wed, 19 Oct 2016 20:34:09 +0200 |
User-agent: | Mozilla/5.0 (X11; Linux x86_64; rv:45.0) Gecko/20100101 Thunderbird/45.3.0 |
On 19.10.2016 at 20:09, Greg Wooledge wrote:
> On Wed, Oct 19, 2016 at 07:28:24PM +0200, Christof Warlich wrote:
>> I want my script(s) to print stderr to the terminal, while both
>> stdout _and_ stderr should go to a logfile.
>
> Not possible without hacks that will break synchronization between
> the streams.
>
>> Any ideas how this could be done, ideally without using any
>> temporary files or named pipes?
>
> A temp file probably wouldn't help, because you'd lose *all* sync
> information.  Instead of having O1 E1 O2 E2 (stdout line 1, stderr
> line 1, and so on) you'd end up with O1 O2 O3 E1 E2 E3.  Probably
> not what you want.
>
> You could set up two piped readers, one for each stream, and have
> them both write to your log file (opened in *append* mode by each
> one).  Then the stderr reader (but not the stdout reader) would also
> write to the terminal.
>
> So, two process substitutions (which are roughly equivalent to named
> pipes), and each one is reading in (presumably) a line-oriented way.
> That's the best you're likely to get.
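For illustration, a minimal sketch of the two-reader setup Greg describes, using bash process substitution; the log file name and the read loops are placeholders, not taken from the original mail:

    #!/bin/bash
    logfile=build.log            # placeholder name

    exec 3>&2                    # save the terminal's stderr before redirecting fd 2

    # stdout reader: append each line to the log only.
    exec 1> >(while IFS= read -r line; do
                  printf '%s\n' "$line" >> "$logfile"
              done)

    # stderr reader: append each line to the log *and* copy it to the
    # saved terminal stderr on fd 3.
    exec 2> >(while IFS= read -r line; do
                  printf '%s\n' "$line" >> "$logfile"
                  printf '%s\n' "$line" >&3
              done)

    echo "stdout: log only"
    echo "stderr: log and terminal" >&2

Note that, as Greg warns, the two readers run concurrently, so the relative order of stdout and stderr lines in the log is not guaranteed.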
Thanks for the quick response. So, as far as I understand, a line-oriented piped-reader approach would avoid mixing stdout and stderr output _within_ lines, but the sequence of whole lines from stdout and stderr may still be garbled...?!

That's really rather disappointing: my idea was to make the sequence of commands executed by my scripts (and possible errors) visible on the terminal by employing set -x, while writing a logfile containing _all_ information for "debugging" purposes if needed. This would be particularly useful for scripts that produce loads of output, e.g. when generating toolchains.

Anyhow, thanks for caring :-)

Chris
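Since set -x writes its trace to stderr, the two-reader sketch above would show the trace on the terminal and record it in the log as well, though, again, the interleaving of stdout and stderr lines in the log is not guaranteed. A hypothetical usage, with placeholder command names:

    set -x                  # trace lines go to stderr: terminal and log
    ./build-toolchain.sh    # placeholder for a long-running build step
    set +x

For directing the trace alone, bash 4.1 and later also offers BASH_XTRACEFD to send xtrace output to a dedicated file descriptor.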