Date: Wed, 5 Jun 2002 07:29:41 +0100
Reply-To: "roland.rashleigh-berry" <roland.rashleigh-berry@NTLWORLD.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: "roland.rashleigh-berry" <roland.rashleigh-berry@NTLWORLD.COM>
Organization: ntlworld News Service
Subject: Re: automated error checking in logs
You can pipe the log to grep to pick up these errors and warnings, then pipe
that to wc to count the matching lines, and perhaps automatically mail the
results to yourself at job end. For when you look at the logs, you could have
a similar utility that shows you the lines rather than counting them.
I would guess that when you type the "sas" command you are invoking a script
rather than the SAS binary, so you could have another version of that
script to do what you want and call it "sasbatch", for example.
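A minimal sketch of that grep/wc idea (the log name, keyword list and mail
address here are my own placeholders, not anything from your setup -- the
sketch writes a tiny stand-in log so it runs anywhere):

```shell
#!/bin/sh
# Sketch: scan a SAS log for trouble and count the hits.
# Log name and keywords are assumptions for illustration only.
log=myjob.log

# Create a small stand-in log so the example is self-contained
cat > "$log" <<'EOF'
NOTE: The data set WORK.A has 5 observations.
ERROR: File WORK.B.DATA does not exist.
WARNING: Variable X is uninitialized.
EOF

# -i: ignore case, -E: extended regexp, -c: count matching lines
count=`grep -Eic 'error|warning|uninitialized' "$log"`
echo "$count suspect line(s) in $log"

# At job end you could mail the hits to yourself instead of printing, e.g.
#   grep -Ei 'error|warning|uninitialized' "$log" | \
#       mailx -s "$log: $count suspect lines" yourname@yourhost
```

Dropping the -c and piping to "wc -l" gives the same count; keeping grep's
output instead gives you the "show me the lines" version of the utility.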
"Goldman, Brad AT-Atlanta" <Brad.Goldman@AUTOTRADER.COM> wrote in message
> Hi all,
> I would like to add a code snippet or macro to the end of some of my daily
> batch jobs. It would examine the log file (already copied to an external
> file), and "grep" for certain words, like "error" and "uninitialized".
> I'm looking for the easiest way to do this. My first thought was to pipe the
> results of "grep -is 'error' <file>" to a filename, and read the results,
> doing this for each of the keywords. Are there better ways?
> Note that I don't care about the actual contents of the grep results. I
> don't care what it said -- I will go back and examine the logs by hand --
> I just want to know *if* there was an error, so I can trigger an email to
> P.S. V8.2, unix on solaris, jobs being run via crontab batch jobs
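Since Brad only cares *whether* there was a match, grep's exit status (0 when
something matched) is enough on its own -- no temporary file to write and read
back. A sketch, with the log path and mail address as assumptions:

```shell
#!/bin/sh
# Sketch: use grep's exit status to trigger a notification.
# The log path and address are placeholders, not Brad's real ones.
log=/tmp/daily_job.log

# Stand-in log so the sketch is self-contained
cat > "$log" <<'EOF'
NOTE: all fine
ERROR: something broke
EOF

for word in error uninitialized; do
    # -i: ignore case, -s: stay silent on missing files; we discard the
    # matches and only test the yes/no answer in grep's exit status
    if grep -is "$word" "$log" > /dev/null; then
        echo "found '$word' in $log"
        # e.g.  echo "check $log" | mailx -s "batch job problem" you@yourhost
    fi
done
```

The same yes/no test works from inside SAS via a pipe fileref or the SYSRC
returned by an X command, so the crontab job itself needs no extra step.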