On May 5th, CBC/Radio-Canada president and CEO Hubert T. Lacroix spoke at the Canadian Club of Montreal to present CBC’s financial situation. The speech was meant to expose the impact of the federal government’s cuts on the public broadcaster, but especially to discuss the new reality brought on by the arrival of new means of consuming radio and TV content. Since, to borrow the words of its president, CBC/Radio-Canada is “facing a defining moment,” this address also served to launch an online survey: “Transforming CBC/Radio-Canada for the future.”
This desire to consult CBC employees as well as the general population is certainly commendable. However, fundamental methodological principles must be followed for any consultation to be valid and representative of the opinion of those who take part. Unfortunately, after submitting CBC/Radio-Canada’s consultation to a few tests, we conclude that it presents problems in three major areas: technical and methodological problems with the sampling, a political aim permeating the entire survey, and an answer-choice structure which, along with the wording of the questions, steers respondents towards particular answers.
1. Sampling problems
Online surveys have sparked much discussion, with questions and critiques directed mainly at how online panels are constituted. Unfortunately, CBC/Radio-Canada’s survey does not meet any of the criteria, however minimal, that would allow it to be considered a scientific poll.
First, the questionnaire is open to anyone. Everyone can answer, exactly as in the unscientific polls conducted by news sites asking their readers “Should we trade such and such player?” or “Do you prefer milk or cream in your coffee?”. People may therefore live in a part of the world where the majority of the public broadcaster’s content is unavailable, yet still fill in the survey. This sampling is obviously problematic, since it is in no way representative of the population being surveyed, in contrast with a probability sample.
Obviously, if the survey received answers from every single one of Canada’s 35 million residents, we’d have a pretty thorough assessment of the issue. In the present case, however, there’s a second problem: we need to make sure that every set of answers comes from a different individual. In the survey’s current form, people can fill in the questionnaire more than once.
Tools exist to tell whether a respondent is human or machine. There’s no need to submit everyone completing the survey to a Turing test: some simple, accessible checks are very effective. To paraphrase the end of Lacroix’s speech: “Let’s start right here: take out your smartphones, your tablets, your laptops, go to your desktops and fill in the survey!” How are we to know whether certain people weren’t a little too enthusiastic in answering the call, demonstrating their great love and affection for the Crown corporation by taking the survey multiple times? There’s no way of knowing.
Those who created the consultation probably don’t claim to have created a probability survey: let’s give them the benefit of the doubt until they actually present the results. However, as noted above, the methodological problems run deeper than this particular issue. In fact, simply by examining the questionnaire, one may wonder whether there is any need to wait for the answers to be compiled to have a fair sense of what the results will be.
2. The survey’s political aim
It would be naïve to believe that a poll can be commissioned without there being something it is meant to find. However, minimizing that orientation is part of designing a survey whose answers reflect, as faithfully as possible, what respondents truly believe.
In this case, from its very first lines, there was clearly no attempt to design a survey that could by any stretch be called “objective.” Here is the introduction to the questionnaire:
CBC/Radio-Canada is making a transition to the future. The broadcasting industry is dramatically changing and we must make tough choices to ensure that we are able to seize opportunities and position ourselves to meet the evolving needs of Canadians.
In the questions below, you’ll find real issues facing us as we develop a strategic plan that will take us to 2020 and beyond. This survey was developed to solicit input from Canadians as to the kinds of services they expect from their public broadcaster now and in the future.
Hence, even before starting to answer the survey’s questions, respondents already have these two ideas in mind:
– Change is inevitable;
– Tough choices are required to maintain CBC/Radio-Canada.
Both of these political preconceptions will obviously, in the end, guide the answers.
3. Biased question structure
The third element worth noting, because it contradicts the basic dos and don’ts of survey design, is how the questions were written and worded: questions should avoid leading people towards any particular answer.
We’ve just shown that the CBC/Radio-Canada survey’s introduction presents change as inevitable. Each of the survey’s questions begins with an introductory sentence that keeps reminding us of this “fact.” Here’s one example:
Children are increasingly consuming television content online.
Q6: Looking towards 2020, do you think that our children’s programming should remain on conventional television or be available online only?
– Keep it on the conventional television service.
– Move all children’s programming online.
Since more and more children consume television content online, why should we prevent a child from doing what he or she pleases, or what is most convenient? Why keep offering an outdated service? The person filling in the questionnaire is thus pushed towards the option of moving all children’s programming online.
This is not an isolated case. All of the survey’s questions are biased in this way, ensuring that respondents lean towards new broadcast modes for audiovisual products rather than traditional ones. Let’s be clear: there is nothing abnormal about each question in a survey having an introduction. However, this type of presentation should provide information indispensable to answering the question while making sure no answer is favoured. Reading through the questions, it becomes clear that it will be hard to trust that the results of this consultation tell us what respondents really think.
Moreover, looking more closely at the number of answer choices offered, the majority of questions offer only two options. In logic, this is called a false dilemma: the person answering is given the impression of free choice, but no middle-ground option is offered, nor the possibility of leaving the answer blank. It’s either/or. Dichotomous-choice questions almost always create conditions that push respondents to fill in the questionnaire in accordance with the orientation privileged by those who designed the survey.
In the end, though the initiative is worthy of applause, this consultation will produce results of questionable validity. We won’t know how many people answered, nor whether they offer a fair representation of the Canadian population, nor even whether they live in Canada. Nor will we know whether their answers are sincere or influenced by the political discourse that precedes and permeates the questionnaire. Finally, we won’t know whether their answers simply reflect how the questions were designed. In short, despite the good intentions behind this survey, the actual results will not tell us much about what the population wants CBC to become.
This article was written by Francis Fortier, a researcher with IRIS—a Montreal-based progressive think tank.