There you are, happily watching cats playing tennis, or whatever the latest viral video is, when it screeches to a halt and the dreaded word "buffering" appears. A buffer is supposed to reduce impact between two things, but "buffering" is the most jarring aspect of streaming media. So why is it called that?
The main reason for the confusion is that buffering isn't really what's going on when the message is displayed. You usually only see the message (and experience the annoyance of frozen or jerky playback) when buffering has broken down.
The internet is a messy environment and does not guarantee that data will always arrive in a timely fashion, or even in order. To counter this, programs that play streaming media queue up a few seconds' worth of data (or more) before starting playback. That way, if there is a problem and something doesn't turn up as expected, those few seconds of stored data provide a breathing space while things get sorted out. As data comes in over the network, it goes on the end of the queue, and data at the front of the queue is decoded and played back to the user. This method provides a buffer - in the everyday sense - between the distant source of the streaming media and the local playback software, reducing the impact of any problems with the flow between them.
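The queue described above can be sketched in a few lines of Python. This is a minimal illustration with made-up names (no real player works exactly like this): data arriving from the network goes on the back of a queue, and playback pulls from the front once a few seconds are stored.

```python
from collections import deque

class PlaybackBuffer:
    """A toy playback buffer: first in, first out."""

    def __init__(self, target_seconds=5):
        self.queue = deque()                  # chunks waiting to be played
        self.target_seconds = target_seconds  # how much to store before starting

    def receive(self, chunk):
        """Called as data arrives over the network: join the back of the queue."""
        self.queue.append(chunk)

    def ready(self):
        """Playback only starts once enough data is queued up."""
        return len(self.queue) >= self.target_seconds

    def next_chunk(self):
        """Called by the decoder: take from the front, or None if empty."""
        return self.queue.popleft() if self.queue else None

buf = PlaybackBuffer(target_seconds=3)
for second in range(3):
    buf.receive(f"video-second-{second}")

print(buf.ready())        # True: enough stored to start playing
print(buf.next_chunk())   # video-second-0 plays first, as it arrived first
```

The stored chunks are what absorb hiccups: if the network stalls for a moment, the decoder keeps draining the front of the queue while nothing arrives at the back.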
Buffering is an extremely common technique in computer programming and telecommunications. It comes up anywhere that data has to be passed from one place to another.
But you can't plan for everything, and sometimes there is a disruption of such magnitude that the queue of buffered data empties out. There's nothing left to play, so the program has no choice but to pause and wait for more data to arrive over the network to refill the queue. And many programs choose to inform the user at this point that they are buffering.
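That empty-queue moment can be sketched too. Again this is a hypothetical illustration, not any real player's code: when the front of the queue has nothing left, the only option is to report that the player is waiting - i.e. "buffering".

```python
from collections import deque

# Two seconds of video are queued; then the network goes quiet.
queue = deque(["chunk-0", "chunk-1"])

def play_next():
    """Decode the next chunk, or pause and report when the queue runs dry."""
    if not queue:
        return "buffering..."   # nothing left to play: wait for the network
    return f"playing {queue.popleft()}"

print(play_next())   # playing chunk-0
print(play_next())   # playing chunk-1
print(play_next())   # buffering...
```

Notice that the dreaded message only appears on the third call, when the buffer has already failed to do its job.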
In a way these programs get it backwards. The program is buffering constantly during normal operation, and that's what keeps everything running smoothly. But it's only when things go wrong that the message pops up. From the point of view of the program it makes some sense, because at that point the program is doing nothing but buffering: waiting for more data to arrive so playback can resume. But from the user's point of view it is confusing, because it doesn't seem to correspond to any kind of buffering they know about. Yet the technical sense of buffering is a lot closer to the everyday sense than most computer jargon.
The buffering of streaming media really lessens the impact of unreliable networks.
It's a shame that - because its name is invoked only when things go wrong - we have all come to despise it. Without buffering there would be a lot more starts and stops whenever you watched something online - and there are already enough jerks on the internet without that.
- The Press