defrost 6 days ago

Having a number of running processes take CPU usage to 100% is one thing; having an under-utilised CPU with almost no processes running report 100% usage is another thing entirely, and that is the subject of the article here.
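
As an aside, the "usage" figure most tools show is a derived number, not a direct measurement of load: it's the busy fraction of the delta between two samples of the kernel's time counters. A minimal sketch of that sampling approach (assuming an ordinary Linux box exposing /proc/stat, nothing to do with the mainframe tooling discussed below):

    import time

    def read_cpu_times():
        # First line of /proc/stat holds aggregate jiffies per state:
        # cpu  user nice system idle iowait irq softirq steal guest guest_nice
        with open("/proc/stat") as f:
            fields = [int(v) for v in f.readline().split()[1:]]
        idle = fields[3] + fields[4]   # idle + iowait count as "not busy"
        total = sum(fields[:8])        # guest time is already folded into user/nice
        return idle, total

    def cpu_usage(interval=1.0):
        # Usage = busy share of the time elapsed between two counter samples.
        idle1, total1 = read_cpu_times()
        time.sleep(interval)
        idle2, total2 = read_cpu_times()
        d_idle, d_total = idle2 - idle1, total2 - total1
        return 100.0 * (d_total - d_idle) / d_total if d_total else 0.0

    print(f"CPU usage over the last second: {cpu_usage():.1f}%")

If the underlying counters are skewed, whether by how wait states are accounted or by a hardware design quirk like the one in the article, every tool downstream faithfully reports the wrong number.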

rbanffy 6 days ago

I didn't intend this as an example of the issue the article mentions (a misreporting of usage because of a hardware design issue). It was just a fun example of how different hardware behaves differently.

One could also argue Omegamon (or whatever tool) was misreporting, because it didn't account for the processor time of the various supporting systems that handled peripheral operations. After all, they also paid for the disk controllers, disks, tape drives, terminal controllers and so on, so they might want to drive those close to 100% as well.

  • defrost 6 days ago

    Sure, no drama - I came across as a little dry and clipped since I was clarifying on the fly, as it were.

    I had my time squeezing the last cycle possible from a Cyber 205 waaaay back in the day.

datadrivenangel 6 days ago

Some mainframes have the ability to lock clock speed and always run at exactly 100%, so you can often have hard guarantees about program latency and performance.