Audience Dialogue

Introduction to audience research

The broadcasting industry is unique in having to rely on surveys to find out the size of its audience. Every other industry is either selling something or being visited by clients - so it can add up its sales figures, or count its clients.

But radio and TV stations send their signals out into the air. Unless they do surveys, they have no way of knowing the size of their audiences. That's why they were among the earliest users of surveys. Somewhere I have an example of a postcard questionnaire sent out to radio licence holders in New Zealand in 1932. Among the questions was the classic "Do you dance to broadcast dance music?" It brings a mental picture of a 1930s couple, waltzing around their living room as their huge wooden radio blares scratchy dance music from its great horn-like speaker.

These days, in most advanced countries, TV audiences are measured by meters. Because radios are smaller than TV sets, and more portable, radio meters haven't caught on, so radio audiences are usually measured with listening diaries. TV meter surveys are complex and expensive. So are diary surveys, if done properly.

Most countries in the developed world have a few companies which specialize in audience measurement. As the market research industry becomes more multinational, a handful of large companies are taking over more and more of the audience measurement - producing figures that show how many people are listening to or watching each station, at each time of day, each day of the week.

This Media Research web page from Victoria University in Australia lists the main companies that specialize in the collection of audience and media research, mostly in Asia and the Pacific region. Among the largest and therefore best known of these companies are:

Most of these companies have their own web pages. Some (e.g. Arbitron) give a few details of their research methods, but mostly these sites give you little idea of exactly how the surveys are done. Audience measurement requires specialized expertise, quite a lot of money, and a lot of finicky attention to detail.

In concept, audience measurement is simple. But in practice, there are many pitfalls. It's easy enough to produce figures, but to produce demonstrably correct figures is becoming more and more complex. And for potential advertisers on radio and TV, consistency in results is almost more important than accuracy.

A one-off survey, done by an organization not experienced in audience measurement, can't be expected to produce accurate audience figures. When people's use of time is being measured, there are many pitfalls.

For example, two recent surveys in Australia estimated the average time people spend listening to radio. One survey put the figure at 10 minutes a day; the other, at 3 hours. Guess which one was funded by the radio industry!

Both surveys were well done, and both were correct. Does this demonstrate, as many people suspect, that "you can prove anything you like with surveys" - or "with statistics"? In fact, that suspicion is unfounded, as long as the survey is competently done and accurately covers the population surveyed. The problem is not a statistical one, but a linguistic one. Apparent contradictions like the one above can nearly always be resolved by checking the actual questions asked.

The radio industry's survey, in the above case, defined radio listening as "being in the same room as a radio that is switched on." Obviously that definition will produce a high number of listening hours. The other survey was focused not on radio but on how people spend their time. The 10-minute figure applied to radio listening as a primary activity - i.e. when people were doing nothing else but focusing on the radio.

So the difference between the two figures (10 minutes and 3 hours) shows that most of the time when people are listening to radio, it is a secondary activity while they do something else.
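
To make that arithmetic concrete, here is a small sketch in Python. The diary records, field names, and figures below are my own invention, not data from either survey; the point is only that the same records produce two very different totals depending on which definition of "listening" is applied:

    # Hypothetical diary records for one person's day:
    # (minutes, radio switched on in the room?, radio the main activity?)
    diary = [
        (10, True, True),     # breakfast: listening to the news and nothing else
        (80, True, False),    # radio on while cooking and cleaning
        (90, True, False),    # radio playing in the background at work
        (120, False, False),  # no radio at all
    ]

    in_room_minutes = sum(minutes for minutes, radio_on, _ in diary if radio_on)
    primary_minutes = sum(minutes for minutes, _, main_activity in diary if main_activity)

    print("'In the room with a radio switched on':", in_room_minutes, "minutes")
    print("Radio as the primary activity:", primary_minutes, "minutes")
    # The gap between the two totals is the time when radio was a secondary activity.

The wording of the question, not the statistics, decides which of those two numbers a survey reports.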

I don't recommend that stations do their own surveys of audience size, especially if they are inexperienced at doing surveys. Not only is it difficult to get valid results, but when it comes to impressing potential advertisers and funding agencies, a survey done for a station by that station will often be seen as lacking credibility - no matter how well it was done.

Audience surveys - other than measuring audience size

In contrast, surveys about program content are easier to do than those measuring audience size, and often produce more useful results. Such surveys can include:

There are many more possibilities, but that covers the most common. Check out the case studies on this website for examples of the above types of survey.

Audience surveys for organizations other than broadcasters

If your organization isn't a radio or TV station, you don't need surveys to measure the size of your audience: you should (in theory) be able to gauge it from sales or visitor numbers. If you have a web site, a museum, a theatre company, a magazine, or a small business, you can count your customers. What you may not know, though, is how many different customers you have. If 50,000 pairs of feet come through your doors, is that 50,000 people once, or 1,000 people 50 times, or what? To find out, you probably need a survey. And if last year's number was 60,000, and before that it was 70,000, you may be in urgent need of a survey.
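
A little arithmetic shows how wide open that question is. The sketch below (Python again, with invented figures) lists just three of the many combinations of visitor numbers and visit frequency that add up to the same 50,000 visits:

    total_visits = 50_000  # what the door counter or the ticket sales tell you

    # Three (of many) ways the same total can arise - invented figures for illustration.
    for unique_visitors in (50_000, 10_000, 1_000):
        visits_per_person = total_visits / unique_visitors
        print(f"{unique_visitors:>6} different people, about "
              f"{visits_per_person:.0f} visit(s) each = {total_visits:,} visits")

    # Only a survey, or some other way of recognizing repeat visitors,
    # can tell you which of these describes your audience.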

The internet is a vast new area for audience research: all sorts of things remain to be discovered. Most sites collect logfile data, but don't analyse it in any detail. And though page view statistics are better than nothing, they leave a lot of questions unanswered. User panels, such as those run by Media Metrix, AC Nielsen, and (in Australia) Red Sheriff, provide a little more information, but web site owners who want more detail will need to organize some real audience research.
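
Even a rough analysis of a logfile answers more questions than a raw page view count. The sketch below uses a simplified, hypothetical log format (a visitor identifier followed by the page requested); a real server log would need a proper parser, and "visitor" is itself only an approximation based on cookies or IP addresses:

    from collections import Counter

    # Hypothetical log lines: visitor identifier, then the page requested.
    log_lines = [
        "visitor42 /home",
        "visitor42 /programs",
        "visitor07 /home",
        "visitor42 /home",
        "visitor99 /contact",
    ]

    page_views = Counter()
    visitors_per_page = {}

    for line in log_lines:
        visitor, page = line.split()
        page_views[page] += 1
        visitors_per_page.setdefault(page, set()).add(visitor)

    for page, views in page_views.most_common():
        print(page, "-", views, "page views from",
              len(visitors_per_page[page]), "different visitors")

    all_visitors = {line.split()[0] for line in log_lines}
    print("Total:", sum(page_views.values()), "page views from",
          len(all_visitors), "different visitors")

Even this crude tally separates page views from people - the distinction that page view statistics alone can't make.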

Audience research and market research

Market research is mostly concerned with brands, and for most types of product there is usually a small number of brands. To have as many as 50 brands of a physical product is a little unusual; brands are a useful concept when people are making repeated purchases of, say, soap. Most people, if they buy a brand of soap, would expect it to be the same every time. Soap that varied in quality and properties would be a nuisance.

But in the case of media, the arts, and communications in general, the message is expected to be different every time. Not completely different, of course, so brands are still possible - there's an expected range of variation for most radio and TV stations, and stations with too broad a scope of programming usually get very small audiences.

Market researchers who are quite at home with analysing answers to questions such as "Which brands of soap have you bought in the last six months?" can be disconcerted by the answers to questions such as "Which books have you read in the last six months?" The range of answers is often so large that a quite different research approach is needed. That's why audience research is different.
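
The contrast is easy to see if you tabulate the two kinds of answer. In the toy example below (invented answers, Python), the soap question collapses into a handful of brands, while almost every book answer is unique - so before any useful tabulation, the book answers would have to be grouped into broader categories such as genre or author:

    from collections import Counter

    # Invented answers for illustration; a real book question would produce
    # hundreds of titles, most of them mentioned only once.
    soap_answers = ["Lux", "Dove", "Lux", "Palmolive", "Dove", "Lux"]
    book_answers = ["Wild Swans", "The Hobbit", "A Brief History of Time",
                    "Possession", "Longitude", "The Hobbit"]

    for label, answers in (("Soap brands", soap_answers), ("Books", book_answers)):
        counts = Counter(answers)
        print(f"{label}: {len(answers)} answers, {len(counts)} distinct;"
              f" most common: {counts.most_common(1)[0]}")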