Lies, Damn Lies and Opinion Polls

Are journalists prone to being dazzled by numbers?

by Alison McCulloch

“Conservative Young Cautious on Sex Education,” claimed a headline in the Sunday Star Times early in January. According to the paper, “sex education could be too much for conservative Kiwi youths” who, it concluded, hold “conservative views on sex issues”. In the face of recurring moral panics about our over-sexualised, over-sexed drunken youth, it was certainly a surprising claim. Could it be that the nation’s teenagers are all ensconced in their bedrooms doing their maths homework after all?

The Sunday Star Times’s article was based on a poll commissioned by the conservative lobby group Family First, whose media release on the same day had pretty much the same take, bearing the headline: “Teens Conservative on Sex/Abortion Issues – Poll”. “A nationwide poll of 600 young people aged 15-21 has found that they hold conservative values on sex issues,” the release began.

A cursory glance at the survey data actually tells quite a different story – different, at least, depending on the story you want them to tell. What’s more, Family First’s own release provided evidence of less – rather than more – sexual conservatism, with its expressions of concern at New Zealand’s high teen pregnancy and abortion rates, and “out of control” sexually transmitted infections. None of which seemed to disturb the Sunday Star Times in its desire to run with a headline that wasn’t just perkily counterintuitive, but also got to include the three-letter “s” word. Nor did any doubts plague The Dominion Post’s columnist Karl du Fresne, who was heartened that the survey showed teenagers “see through all the fraudulent b…..” about sex education.

The “conservative teens” survey was one of around a dozen opinion polls released by Family First in the past year on a range of hot-button moral issues, many of which, just like the youth survey, attracted some surprisingly uncritical attention from the mainstream media. But of course it’s not just Family First surveys that sometimes get an easy ride. Under the “polling” category on their StatsChat blog, faculty from the University of Auckland’s Statistics Department list numerous examples of poor use of polls on issues ranging from smoking to students fleeing overseas to the media’s own “self-selected web site polls”. “We’ve commented before on the annoying tendency of newspapers to claim that self-selected website polls actually mean something,” Prof. Thomas Lumley wrote in a January post. “The media usually refers to the results as coming from an ‘unscientific poll’, but a better term would be ‘a bogus poll’”.

Are journalists really so dazzled by figures that they suspend critical judgment in the face of a percentage or margin of error? Let’s take a closer look at that youth survey. The question that led Family First’s poll was this: Do you think sex education in schools should teach values, abstinence and consequences such as pregnancy, or just teach safe sex? The results, with a 4% margin of error, showed 34% opted for “values, abstinence and consequences”, 19% for “just…safe sex”, while a plurality of 42% rejected the either/or options offered by the pollsters and, unprompted, said they’d like both. (For the questions and full results on this and other Family First polls, look for “Family Issue Polls” under the “Research” tab on Family First’s homepage. [http://www.familyfirst.org.nz/])
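(As a rough check of that figure: the quoted 4% is consistent with the standard worst-case margin of error for a simple random sample of 600 at a 95% confidence level. The sketch below assumes that conventional formula; the Curia report may describe its method differently.)

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a simple random sample of size n
    at a 95% confidence level (z = 1.96), evaluated at p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 600: +/- {margin_of_error(600):.1%}")  # roughly +/- 4.0%
```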

So how did the Sunday Star Times get from those figures to youth conservatism on sex issues? The paper, through its deputy editor, declined to comment, leading to the suspicion it got there the same way Newstalk ZB did – with a little help from Family First. Newstalk also ran the story in at least two bulletins, arguing that the survey showed “more conservative views about sex are developing amongst young people”.

Family First won’t reveal how much it spends on polling, but its director Bob McCoskrie says the organisation generally budgets $1000 a question. With at least a dozen questions publicised in 2011 as well as some internal polling that’s not made public, a tab of $15,000 is undoubtedly conservative. Whatever the total, it’s clearly money well spent, and a drop in the bucket of Family First’s total budget. Accounts available on the Charities Commission website show the organisation’s income last year was a healthy $352,448, with almost $320,000 of that coming from donations.

McCoskrie says Family First is pleased with its opinion polling and plans to continue. “There’s been some polls that we’ve released that the media have completely ignored,” he says. “And yet there’s been others that they’ve loved.” McCoskrie describes it as a bit like fishing. “You can put on really good bait and it’s not taken and other times you can put something out and people love it.” He points to a poll released in March 2011 that asked respondents whether the government should set up an independent complaints authority for Child Youth and Family (CYF). Despite 65% support, the media didn’t touch it. “Then it came up in January, and I re-released the results and said, ‘look we polled people on this issue back in March and here’s what they said,’ and suddenly the media are interested.”

It’s all about the timing.

And the wording. Just how those questions are constructed is crucial, and a look through the past year’s surveys suggests this is carefully done. Another question in the poll of 600 youth asked: “Provided it won’t put the girl in physical danger, should parents be told if their school-age daughter is pregnant and considering getting an abortion?” Although the question doesn’t make clear who “should” be doing the telling (The daughter? The school? The authorities?) and doesn’t talk about the law, Family First’s interpretation was that the 59% who answered “yes” were providing a clear rebuke to politicians who didn’t amend the relevant legislation in 2004 to make such parental notification mandatory. (By way of full disclosure, the author is an active member of the pro-choice group ALRANZ [http://www.alranz.org/], which frequently disagrees with Family First on reproductive rights issues.)

Dr. Peter Thompson, a senior lecturer in the Media Studies Programme at Victoria University, took a look at several of Family First’s polls, including the youth survey. One problem he identified was the condensing of multiple factors into a single question. “The question about whether sex education classes should include values, abstinence and consequences could be measuring young people’s responses to any one of those factors,” he says. “It’s not really possible to be sure what’s being measured here.” Thompson says the yes/no framing of questions and the inclusion of multiple factors limit the validity of inferences you might draw. “It’s not clear whether this stems from poor research design by the market researchers/company or very careful (but biased) research design by the commissioning party.”

So who designs Family First’s questions? McCoskrie says they get tossed around by the organisation in conjunction with an advisor, whom he won’t name but describes as “one of the top guys in the country in this area”, and the polling firm that conducts the surveys, Curia. “We get quite a bit of input from David Farrar – there’s been some changes he’s made that we might not have thought of,” McCoskrie says.

Farrar is Curia’s principal and runs the well-known conservative Kiwiblog. [http://www.kiwiblog.co.nz/] In his disclosure statement, Farrar says he’s been a member of the National Party since 1986 and worked under National politicians in Parliament for almost eight years. “My politics are well known, but I try not to mix politics and business,” Farrar says, explaining he’s done polling work for left-wing politicians at the local body level as well as for the National Party. Other clients willing to be publicly identified include John Banks, Independent Liquor, the Wanganui Chronicle, the Northern Advocate and the Republican Movement.

Farrar says there are always multiple ways to write a question and disputes suggestions that Family First’s are designed to get a particular outcome. “[They’re] not designed to get particular results, [they’re] designed to get answers to specific questions, but they’re not the only questions that can be asked.” What’s crucial, Farrar says, is transparency.

“The absolute key is to make sure that full question is known so people can come to their own interpretation,” he says. “You can agree or disagree with what they put out in their press releases, but they are, when you look at other groups out there, meticulous about making sure they always link to the full questions and answers. And I don’t actually think there’s necessarily any other groups that do that.”

Farrar’s right there. Family First provides easy access to the reports issued by Curia, which include the survey questions, results, number and demographics of respondents, among other details. But do journalists ever bother to look? Media coverage of a suite of around nine questions asked last March and released periodically throughout the year would suggest they often don’t. Topics covered in the March survey of 1000 people included abortion, brothels and street prostitution, binding referendums, same-sex adoption, smacking, a commission of inquiry into family violence and child abuse, and the need for a CYF complaints authority.

None of the reporting found by this writer, for example, noted that the March 2011 polls over-represented older age groups, women and those living in rural and provincial areas – something Dr. Jenny Neale, a senior research fellow at Victoria University’s School of Government, pointed out after taking a look at the polls. “If you look at Stats NZ you’ll find that metro – people living in the big cities – makes up 72% of our population and they’ve got 42% coming from metro. And provincial areas should be 14%, they’ve got 27%, while rural – all areas that are not metro or provincial – they’ve got 31% and it should only be 14%,” Neale says, adding that research has shown older people and those living outside metro areas tend to be more conservative.

Farrar acknowledged that the March 2011 suite of polls didn’t strictly match the population, and although Curia would usually weight the results to take that into account, it didn’t do so in that case. “I’m not sure why I didn’t weight that one,” Farrar says, “but having said that, let me tell you that weighting doesn’t normally drastically change the results.”
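To see what weighting actually does, here is a rough post-stratification sketch using the population and sample shares Neale cites. The within-group “yes” rates are hypothetical – the published reports aren’t broken down that way here – so the numbers only illustrate the mechanics, not any actual Family First result.

```python
# Post-stratification sketch. Population vs sample shares are those cited
# by Neale; the per-group "yes" rates are hypothetical, for illustration only.
population_share = {"metro": 0.72, "provincial": 0.14, "rural": 0.14}
sample_share     = {"metro": 0.42, "provincial": 0.27, "rural": 0.31}
yes_rate         = {"metro": 0.45, "provincial": 0.55, "rural": 0.58}  # hypothetical

# Each group is weighted by how under- or over-represented it is in the sample.
weights = {g: population_share[g] / sample_share[g] for g in sample_share}
print({g: round(w, 2) for g, w in weights.items()})
# {'metro': 1.71, 'provincial': 0.52, 'rural': 0.45}

unweighted = sum(sample_share[g] * yes_rate[g] for g in sample_share)
weighted   = sum(population_share[g] * yes_rate[g] for g in population_share)
print(f"unweighted: {unweighted:.1%}  weighted: {weighted:.1%}")
# unweighted: 51.7%  weighted: 48.2%
```

In this made-up example the headline figure moves by about three and a half points – not drastic, but enough to matter when a result is close.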

Indeed, in most of those polls the margin between respondents who agreed with Family First’s position and those who didn’t was probably wide enough to be largely unaffected by weighting – but not in every case. Questions about the definition of marriage (“The law currently defines marriage as being only allowable between a man and a woman. Do you support this?”) and same-sex adoption (“Do you think same sex couples should be allowed to adopt children?”) were relatively close. (On the marriage question 52% said “yes”; 42% “no”; with 6% listed as “unsure/refuse”, while on same-sex adoption, 50% answered “yes”; 40% “no”; with 10% “unsure/refuse”.) This writer couldn’t locate any mainstream media coverage of either of those polls.

Pattrick Smellie, a former corporate communications and brand manager in the energy sector and now co-owner of Business Desk News Services, has seen advocacy polling from both sides of the aisle. “A lot of NGOs do polling in a selective fashion,” he says. “I’m a bit jaundiced, but my general observation is that NGOs get away with murder compared with the corporates.” Journalists, he argues, tend to be more suspicious of polls issued by businesses than those put out by NGOs.

Smellie thinks the kind of selective polling Family First and others carry out undermines their integrity, and he distinguishes good polling from bad by whether the poll is “trying to listen” or “trying to speak”. “What Family First does when it takes a poll is it is trying to speak,” Smellie says. “They are pushing a particular button, which does sell, which plays to that kind of terror of catastrophe … crime and violence, what constitutes ‘appropriate’ family behavior.”

But he also finds fault with the media’s often uncritical use of selective or advocacy polling. They should at least do the basics, Smellie says, giving the question, the sample size, the demographics and distribution of respondents and the source of the poll. “You have to put those things in otherwise you’re reporting a poll that is meaningless.”

Farrar, too, thinks the media could do a much better job of reporting on polls, including linking to the full poll results online, something McCoskrie also suggests. For his part, McCoskrie says Family First is completely up front and appreciates that there’s a lot of scrutiny of polls. “That’s why we specifically ask for 1000 people to be polled to reduce the margin of error, we go to an independent company, we take input from that company as to the wording and we publish the full results so if people can pick it to pieces and say we’ve completely misinterpreted or misrepresented, then we stand accountable to that,” he says. “To date it hasn’t happened and it won’t because they’ve stood the test.”
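On the sample-size point, the arithmetic does favour 1000 over 600: the worst-case margin of error shrinks with the square root of the sample size, from roughly plus or minus 4 points to roughly plus or minus 3 (again assuming the conventional 95% formula sketched earlier).

```python
import math

# Worst-case 95% margin of error falls with the square root of the sample size.
for n in (600, 1000):
    print(f"n = {n}: +/- {1.96 * math.sqrt(0.25 / n):.1%}")
# n = 600: +/- 4.0%
# n = 1000: +/- 3.1%
```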

It’s hard not to agree that Family First is pretty transparent about its polling, or to begrudge the success it’s had with journalists given its sharp, well-organised and committed media relations operation. Ultimately, Farrar is right that the onus is on those who use the polls to do the legwork. But how much can we expect from a 40-second spot in a radio bulletin?

Thompson says public relations firms and lobby groups know very well that journalists are under pressure and not resourced to do their own research. “So it is quite common for ‘information subsidies’ to be provided in the hope that reporters and editors will accept the statistics at face value and not dig much further to check the facts or identify countervailing perspectives.”

Add to that what he calls the “rather unfortunate assumption” that if something is quantified it is scientific and objective. “The statistics being produced by these polls – and many other similar types – are only as meaningful as the questions being asked.”

END
