Do educational programs educate?

Case study 15

This is an account of surveys that didn't work well - an example of what NOT to do. When I was working with the Education Programs department of Radio New Zealand, we decided to try to measure how much listeners had learned from a short series on alternative energy sources.

We already had a panel of listeners to that network. Before the series began, we sent half of them a questionnaire asking them to give what they thought were the correct answers to 20 questions on alternative energy. We begged them not to cheat by looking up the answers in encyclopedias and the like, but the material in the programs was mostly new and would not have been found in reference books.

After the series had been broadcast, we sent the same questionnaire to the other half of the panel. The two halves were selected at random, so in theory they should have given the same answers.

My idea was that, if the programs worked educationally, the percentage of correct answers should rise steadily with the number of programs heard. And as a check on the results, we could compare the "before" group with those in the "after" group who'd heard none of the programs.
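To make the intended analysis concrete, here's a minimal sketch in Python of the tabulation we had in mind. Everything in it - the data layout, the figures, the field names - is hypothetical, reconstructed from the description above rather than from the original survey.

```python
# Hypothetical sketch of the intended analysis: percentage of correct
# answers tabulated against the number of programs heard. All names and
# figures here are illustrative, not the original survey data.

from collections import defaultdict

# Each respondent: (group, programs_heard, correct_answers_out_of_20).
# "before" respondents answered before broadcast, so programs_heard is 0.
respondents = [
    ("before", 0, 7), ("before", 0, 9), ("before", 0, 6),
    ("after", 0, 8), ("after", 2, 7), ("after", 4, 10),
    ("after", 6, 9), ("after", 3, 6), ("after", 5, 8),
]

# Tabulate mean percent correct by number of programs heard (after group).
by_heard = defaultdict(list)
for group, heard, correct in respondents:
    if group == "after":
        by_heard[heard].append(correct / 20 * 100)

print("Programs heard -> mean % correct (after group):")
for heard in sorted(by_heard):
    scores = by_heard[heard]
    print(f"  {heard}: {sum(scores) / len(scores):.0f}%")

# Consistency check: the "before" group should match those in the
# "after" group who heard none of the programs.
before = [c / 20 * 100 for g, h, c in respondents if g == "before"]
after_zero = by_heard.get(0, [])
print(f"Before group mean: {sum(before) / len(before):.0f}%")
if after_zero:
    print(f"After group, zero programs heard: {sum(after_zero) / len(after_zero):.0f}%")
```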

All very neat in theory, but in practice the results were a mess. The number of correct answers did not rise with the number of programs heard - in fact, it went down. And the two groups who'd heard none of the programs did not give the same answers, but very different ones.

Yet most listeners thought they'd learned a lot from the programs!

I realized afterwards that the problem wasn't that the questions we'd chosen were atypical, but that few people listen to radio in order to learn facts. It seemed (though we were never certain, because the surveys didn't include such questions) that listeners' attitudes to alternative energy sources had become less skeptical after hearing the programs. We'd assumed that people would absorb and remember the facts in the programs - but what they in fact remembered, a few weeks later, was the positive tone and enthusiasm of the speakers.

And after reading more widely in this area, I realized that social experiments of this kind never seem to work. When you split a sample into two equal groups, apply some "treatment" to one, and do nothing special for the other, it's very rare for any effect of that treatment to be detectable in a follow-up survey. The moral: don't waste your time on such experiments.
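Why are such effects so hard to detect? One plausible reason - my own conjecture, not something this case study establishes - is simple arithmetic: with survey-sized samples, the gap between two groups has to be quite large before it stands out from sampling noise. A rough back-of-envelope sketch, using the normal approximation for a difference of two proportions:

```python
# Back-of-envelope sketch (a conjecture, not from the case study):
# how big a gap in percent-correct between "treated" and untreated
# groups is needed to reach significance at the 5% level, using the
# normal approximation for a difference of two proportions.

import math

def gap_needed(n_per_group, p_baseline=0.5, z=1.96):
    """Approximate difference in proportions needed for significance
    at the 5% level with two groups of n_per_group respondents each."""
    se = math.sqrt(2 * p_baseline * (1 - p_baseline) / n_per_group)
    return z * se

for n in (50, 100, 500, 2000):
    print(f"n = {n:4d} per group -> gap of about {gap_needed(n) * 100:.1f} points needed")
```

On these (assumed) numbers, two groups of 100 would need to differ by roughly 14 percentage points before the difference was statistically convincing - and few broadcast series shift factual knowledge that much, so a null result is what you should usually expect.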

To learn more about audience research, see our comprehensive book Know Your Audience.