Business academic Pat Brans assesses multitasking and its usefulness.
Jim Whitehurst started his career as a software engineer. Barely over forty years old, he has made a name for himself as somebody who gets things done: first as COO of Delta Air Lines, and now as CEO of Red Hat. Jim thinks people who multitask in an attempt to get more done are misguided. He says you're much more effective if you do one thing at a time.
When I spoke with Jim Whitehurst about time management, he told me: "This idea of doing your email while you're on the phone, or using your BlackBerry during a meeting, is extraordinarily inefficient. I'm all for making use of dead time, such as when you're waiting in line at the airport or exercising in a gym. But there I'm talking about time where your mind is not required to be in two places at once."
"I insist that people close laptops and not use BlackBerrys during meetings," Jim says. "Because when you get right down to it, you aren't really being that thoughtful on your laptop or your BlackBerry, and you aren't paying attention to the meeting either. In the end you're doing both badly."
And according to Jim, even when you're on the phone, the other person can get irritated if you're doing something other than participating in the conversation. "You can tell when you are on a call with somebody and they are simultaneously doing something on their computer. They might say something, but it's almost like they're saying something to prove they're listening. They wind up not being very useful on the call. On top of that, I'm sure they aren't doing a very good job with whatever it is they're doing on the computer. It's a waste of time."
The CEO of Red Hat summarised his feelings about multitasking: "I think this idea that multitasking saves time is ridiculous."
Let's look at it from another perspective. The term "multitasking" comes from the computing world, where it refers to several programs vying for the services of a single CPU. The CPU can only really work on one thing at a time, so it has to distribute its time fairly among the different tasks. There are different strategies for allotting time to each task, but in all cases overhead is incurred each time the processor switches from one to another.
When it's time for the processor to begin work on a new task, the operating system must first store the state of the current task so it can be recalled later, then load the state of the new task. This procedure, called a "context switch", must complete before the CPU can start on the new program. As you may have guessed, context switching is pure overhead: the processor does no useful work during the switch.
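The save-and-restore procedure can be sketched in a few lines of Python. This is a toy model, not real operating-system code: the task dictionaries, the `context_switch` function and the fields it saves (`pc` for the program counter, plus a register set) are all illustrative names of my own.

```python
def context_switch(cpu, ready_queue):
    """Save the running task's state, then load the next task's state.

    The time spent here is pure overhead: no task makes progress
    while its state is being stored or loaded.
    """
    if cpu["running"] is not None:
        # Step 1: store the state of the current task for recall later.
        cpu["running"]["saved_pc"] = cpu["pc"]
        cpu["running"]["saved_registers"] = dict(cpu["registers"])
        ready_queue.append(cpu["running"])
    # Step 2: load the state of the new task into the CPU.
    new_task = ready_queue.pop(0)
    cpu["pc"] = new_task["saved_pc"]
    cpu["registers"] = dict(new_task["saved_registers"])
    cpu["running"] = new_task


# An idle CPU and two tasks waiting to run:
cpu = {"running": None, "pc": 0, "registers": {}}
queue = [
    {"name": "email", "saved_pc": 100, "saved_registers": {"r0": 7}},
    {"name": "report", "saved_pc": 200, "saved_registers": {"r0": 9}},
]
context_switch(cpu, queue)   # CPU now holds the state of "email"
context_switch(cpu, queue)   # "email" is saved, "report" is loaded
```

After the second switch, "email" sits at the back of the queue with its program counter preserved, so it can pick up exactly where it left off — which is precisely what a human brain cannot do.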
The amount of time allocated for each task to run is called a quantum. If an operating system is configured with a quantum of four milliseconds, and a context switch takes one millisecond, the computer will spend twenty percent of its time context switching. In the extreme case where a computer is spending almost all its time switching and very little time performing any one task, it is said to be "thrashing".
Now consider a case where the quantum is set to ninety-nine milliseconds and the context switch still takes one millisecond. Here the overhead of context switching is minimal: just one percent. However, when the quantum is set too high and the computer is servicing multiple users, all of whom request attention at the same time, some users will perceive a delay. People who design and configure operating systems have to make a trade-off between optimising overall CPU usage and minimising the delay experienced by users.
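The arithmetic behind those two scenarios is simple enough to check in a couple of lines of Python (the function name is mine, not an operating-system term): each scheduling cycle consists of one quantum of useful work followed by one context switch, so the overhead is the switch time divided by the total cycle time.

```python
def switching_overhead(quantum_ms, switch_ms):
    """Fraction of CPU time lost to context switching, assuming each
    cycle is one quantum of useful work plus one context switch."""
    return switch_ms / (quantum_ms + switch_ms)

print(switching_overhead(4, 1))   # 0.2  -> the twenty percent case
print(switching_overhead(99, 1))  # 0.01 -> the one percent case
```

Shrinking the quantum from 99 ms to 4 ms makes the system twenty times more wasteful — the same work gets done, but a fifth of the day goes to switching.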
A person managing his or her time faces a similar trade-off. Responding quickly to lots of different people means keeping each stretch of focused work short, and then the overhead of context switching gets out of hand.
One key difference between a person and a computer is that the computer switches context with one hundred percent accuracy, whereas a human being loses information each time he or she stops one task to start work on another. On each switch, a computer loses time but not information; a person loses both time and information.
Not surprisingly, this shows up in experiments. For example, a 2005 study conducted by Dr Glenn Wilson at the Institute of Psychiatry, University of London, found that "Those distracted by incoming email and phone calls saw a 10-point fall in their IQ - more than twice that found in studies of the impact of smoking marijuana."