Perl STDERR redirect failing
A common functions script that our systems use does a simple STDERR redirect to create user-specific error logs. It goes like this:
    # Re-route standard error output to a text file
    close STDERR;
    open STDERR, '>>', 'd:/output/logs/stderr_' . &parseusername($ENV{REMOTE_USER}) . '.txt'
        or die "Couldn't redirect STDERR: $!";

Now, I copy-pasted this into my own functions script to get a system-specific error log, and while it compiles, it breaks the scripts that require it. Oddly enough, it doesn't print the error the child script is throwing. My slightly modified version looks like this:
    close STDERR;
    open (STDERR, '>>', 'err/stderr_spork.txt')
        or die print "Couldn't redirect STDERR: $!";

Everything compiles fine from the command prompt: perl -c returns OK, and if I throw a warn into the functions script and compile, it outputs properly. I still don't understand why it kills the child scripts, though. If I cut the redirect out, sure enough, they work. Thoughts?
die (and warn) write to STDERR. If you close STDERR and then need to die because the attempt to reopen it failed, where do you expect to see the error message?
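One way to keep that failure visible (not part of the original answer, just a minimal sketch with an illustrative log path) is to hold on to a duplicate of the original STDERR and write the complaint there:

    use strict;
    use warnings;

    my $log = 'd:/output/logs/stderr_example.txt';   # illustrative path

    # Duplicate the current STDERR before closing it, so a failed reopen
    # still has somewhere visible to report the problem.
    open my $old_stderr, '>&', \*STDERR or die "Can't dup STDERR: $!";
    close STDERR;
    open STDERR, '>>', $log
        or do {
            print {$old_stderr} "Couldn't redirect STDERR to $log: $!\n";
            exit 1;
        };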
Since this is Perl, there are many ways to address the issue. Here are a couple.
Open the file on a temporary filehandle first, and reassign STDERR only if that goes OK:
    if (open my $tmp_fh, '>>', 'd:/output/logs/stderr_' . &parseusername($ENV{REMOTE_USER}) . '.txt') {
        close STDERR;
        *STDERR = *$tmp_fh;
    }
    else {
        die "Couldn't redirect STDERR: $!";
    }
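If the glob assignment looks opaque, here is a self-contained sketch (the log file name is made up) you can run to confirm that warnings land in the file after the swap:

    use strict;
    use warnings;

    my $log = 'stderr_test.txt';   # hypothetical log path for the demo

    if (open my $tmp_fh, '>>', $log) {
        close STDERR;
        *STDERR = *$tmp_fh;   # alias the STDERR glob to the new handle
    }
    else {
        die "Couldn't redirect STDERR: $!";
    }

    warn "this should land in $log, not on the terminal\n";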
Use CON. For programs run from the command line, systems have the concept of "the current terminal". On Unix systems it's /dev/tty; on Windows it's CON. Open an output stream to that terminal pseudo-file:

    open STDERR, '>>', 'd:/output/logs/stderr_' . &parseusername($ENV{REMOTE_USER}) . '.txt'
        or do {
            open my $tty_fh, '>', 'con';
            print $tty_fh "Couldn't redirect STDERR: $!";
            exit 1;
        };
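If the same functions script has to run on both Windows and Unix-like systems, a small variant like this keeps the fallback portable (the device selection via $^O is my addition, not part of the original answer):

    use strict;
    use warnings;

    # 'con' on Windows, '/dev/tty' elsewhere -- assumption based on $^O.
    my $tty_device = $^O eq 'MSWin32' ? 'con' : '/dev/tty';

    my $log = 'd:/output/logs/stderr_example.txt';   # illustrative path

    open STDERR, '>>', $log
        or do {
            # Report the failure on the current terminal instead of the
            # (possibly closed) STDERR, then bail out.
            if (open my $tty_fh, '>', $tty_device) {
                print {$tty_fh} "Couldn't redirect STDERR to $log: $!\n";
            }
            exit 1;
        };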