Answer
$\frac{5000}{5001}$ of the machine’s time would be spent actually executing processes. If every process requested an I/O activity after only one microsecond of its time slice, the machine’s efficiency would drop to $\frac{1}{2}$; it would spend as much time performing context switches as it would executing processes.
Work Step by Step
The fractions follow from comparing the length of a time slice with the length of a context switch. Each full cycle consists of a 5000-microsecond time slice followed by a 1-microsecond context switch, so out of every 5001 microseconds, 5000 are devoted to process execution; hence $\frac{5000}{5001}$ of the machine’s time is spent actually executing processes. However, when a process requests an I/O activity, its time slice is terminated while the controller performs the request. Thus if each process made such a request after only one microsecond of its time slice, every microsecond of useful execution would be followed by a one-microsecond context switch, and the efficiency of the machine would drop to $\frac{1}{2}$. That is, the machine would spend as much time performing context switches as it would executing processes.
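As a quick check, both ratios can be written out explicitly, assuming (as the figures above imply) a 5000-microsecond time slice and a 1-microsecond context switch:

$$\text{efficiency} = \frac{\text{time slice}}{\text{time slice} + \text{context switch}} = \frac{5000\ \mu\text{s}}{5000\ \mu\text{s} + 1\ \mu\text{s}} = \frac{5000}{5001}$$

$$\text{efficiency with I/O after }1\ \mu\text{s} = \frac{1\ \mu\text{s}}{1\ \mu\text{s} + 1\ \mu\text{s}} = \frac{1}{2}$$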