Another question: why do you think 99% CPU consumption is good? If an
IO-intensive task needs a lot of CPU, there might be a problem! The goal
is to make the IO as fast as possible, and in that case the CPU usage
should be low. Not 0, but as you said, 5-10% should be fine. If it is 99%,
you might have a problem with some calculations, branches, or loops. Maybe
there is potential for optimization. Maybe not, if the logic is already
as efficient as it can be.
If you were on a z/OS mainframe, you would treat a batch job that uses a
lot of CPU but does no IO as a serious problem: that is a critical loop
situation, and the job must be stopped.
On Windows you can use the Task Manager. If you see a high IO rate and
low CPU, that is good for mass data.
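
You can see that split per step in SAS itself with the FULLSTIMER option,
which David mentions below. A minimal sketch (the dataset names are
hypothetical):

    options fullstimer;  /* log real time, CPU time and memory per step */

    /* IO-bound step: copying a large dataset.
       Expect real time well above CPU time. */
    data work.copy;
      set work.big;      /* work.big is a hypothetical large dataset */
    run;

    /* CPU-bound step: a tight compute loop with almost no IO.
       Expect CPU time close to real time. */
    data _null_;
      x = 0;
      do i = 1 to 1e8;
        x + sqrt(i);
      end;
    run;

Comparing real time against CPU time in the log for each step tells you
whether that step is waiting on the disk or burning CPU.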
On Tue, 10 Jul 2007 18:10:27 +1000, David Johnson <d@DKVJ.BIZ> wrote:
>SAS uses as much CPU as it needs to perform a task. Usually it consumes
>quite little CPU, because the machine is engaged in the business of
>I/O and everything else waits on the data coming off the disk.
>I can sometimes get high CPU usage in a SAS session, but generally only in
>You can look at the CPU rate yourself, step by step, if you include the
>statement Options FullSTimer; which adds details on run time, CPU time and
>memory usage for each step to the log.
>From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU]On Behalf Of
>Sent: Monday, 9 July 2007 11:59 PM
>Subject: varying SAS cpu usage
>I frequently run code written in SAS (under Win XP, both on PCs and
>servers) that processes large data sets (more than 10 billion
>observations). Throughout the runtime of the SAS job, the CPU usage of
>SAS.exe ranges from 5% to 99%. Does anybody know if this is normal
>(i.e. is this the default behaviour of SAS)? If yes, is there any way
>to force SAS to use 99% of the CPU?
>Thanks in advance