MyDemocracy.ca Creator, CBC rolling out Canada 150 survey in early new year


The creators of the MyDemocracy.ca debacle have been granted $576,500 from the hundreds of millions splashed around by the federal government for the Canada 150 celebration. Vox Pop Labs is creating a survey “to engage Canadians in a process of self-exploration and a reflection of who we are at this particular juncture in our history,” explained founder and chief executive officer Cliff van der Linden.

The Vox Pop grant money is divided into $300,500 in consultancy fees, $175,000 in outreach costs, $63,000 in seminar costs, $24,000 in equipment costs and $14,000 in venue costs.

The grant amount for the early 2017 survey “Project Tessera” is $250,000 more than what the Liberals spent on this week’s electoral reform survey, so scorned by opposition MPs, journalists and academics. Vox Pop stated in its grant application form (obtained via an access to information request) that it “is committed to operating Project Tessera exclusively on a cost recovery basis.”

Linden explained by email why the upcoming project carries a much higher price tag.

“Project Tessera is a much larger undertaking than MyDemocracy.ca. It involves national focus groups, a respondent panel five times the size of that used for MyDemocracy.ca, a working conference bringing together leading scholars in the field, and a more complex design and technical architecture.”

The two-day conference taking place this month includes rooms, food, hotels, and flights for around 30 academics so they can get together to “hone” the survey tool for the Canada 150 initiative. One of Vox Pop’s academic partnerships is with the McGill Institute for the Study of Canada. The advisory panel will develop “a distillation of the archetypes for presentation to the public…”

The archetypes used to classify or pigeonhole (depending on who you ask) participants in Vox Pop’s MyDemocracy.ca survey were roundly mocked by the public on Twitter this week, where users deployed the magic of Photoshop to create their own personalized and hilarious archetypes. On Thursday the National Post’s Tristin Hopper created a parody survey called DemocracyMine.ca, which served unfortunate participants off-the-wall questions like “There should not be anything in the rules that says a dog cannot be elected to Parliament and also play professional basketball” and endearing archetypes such as “dinosaur”, “villain”, and “pervert.”

However, the Project Tessera application for the upcoming survey has been given rave reviews in letters of support from political science professors from the University of Toronto and McGill.

For the Canada 150 initiative survey Vox Pop is partnering once again with the CBC. Vox Pop is also the creator of Vote Compass, used in both federal and provincial elections. In the 2011 election, Vote Compass became divisive because some accused the tool of being flawed or skewed, as the needle apparently pointed due Liberal more often than not.

Linden says the accusations “have been entirely debunked, both in the public discourse and academic studies.”

Mario Canseco, vice-president of Insights West, believes the segments in Vox Pop’s recent electoral reform survey were not adequately defined. “I was labelled a ‘Pragmatist’, and then they showed all of the things that were the most important for ‘pragmatists’, and I had answered ‘No’ or ‘Opposed’ to all of them. It’s almost like labelling someone a ‘bird’ and then saying: ‘You live in the sea and have gills.’ Unless you have some behavioural questions that go beyond saying ‘Yes’ or ‘No’ to a proposition, you can’t just place people in categories at your leisure. Properly done segmentations work well, particularly in consumer research, because they go beyond asking about a logo or a plan. This is why MyDemocracy.ca reminded me of the Cosmopolitan quizzes of the past,” explained Canseco.

In response to detractors, Linden wrote: “I encourage critics to review the methodology, which is available directly in the application. I think the results will complement existing public consultations related to this initiative.”

In this latest partnership between CBC and Vox Pop, Project Tessera is supposed to, according to a letter of support from CBC Radio Canada, “allow participants to learn more about their own national identities and cultures, and to explore the commonalities they have with other people across the country. This tool would help users learn about themselves and the diversity of identities in Canada.”

Satirists and pranksters may be whetting their appetites for round two, but Project Tessera is touted to be a sophisticated educational tool that aims “to engender among all Canadians an individual sense of belonging within a richly diverse national mosaic.” The tool will survey users on “themes such as culture, values, symbols, belonging, etc.” and “has the potential to generate an unparalleled dataset on public perceptions about Canada and what it means to be Canadian.”

And those concerned with oversimplification à la MyDemocracy.ca need not fret: Project Tessera will use a “multiplicity of narratives associated with being Canadian” to make custom characterizations that are “[m]ore than a simple association of user to archetype … each user will be presented with a personalized analysis which simultaneously celebrates their individual identity and inclusion in the broader Canadian collective.”

Every Canadian is a special snowflake after completing Project Tessera.

In the assessment, the Canadian Heritage bureaucrat reviewing Vox Pop was impressed with the company’s “unparalleled levels of online engagement reach,” such as the more than 1.8 million Canadians who used Vote Compass during the last federal election. The bureaucrat was also pleased that the digital tool is “particularly well-suited to youth.”

Those eager to test out Project Tessera won’t have to wait long; the online survey should be ready for a spin on CBC’s website by early next year.

Mario Canseco Q&A

1) What are the main problems you see with the government’s electoral reform survey?
The segments are not adequately defined. I was labelled a “Pragmatist”, and then they showed all of the things that were the most important for “pragmatists”, and I had answered “No” or “Opposed” to all of them. It’s almost like labelling someone a “bird” and then saying: “You live in the sea and have gills.” Unless you have some behavioural questions that go beyond saying “Yes” or “No” to a proposition, you can’t just place people in categories at your leisure. Properly done segmentations work well, particularly in consumer research, because they go beyond asking about a logo or a plan. This is why MyDemocracy.ca reminded me of the “Cosmopolitan” quizzes of the past. They would ask:
How do you deal with a break-up?
a) Go for a run. b) Ice cream. c) Step on some ants.
How do you deal with stress?
a) Pushups. b) Chocolate Cake. c) Bite lip until it bleeds.
And so on…
So, if you answered mostly a), you’re a jock. Mostly b) a glutton. Mostly c), a potential serial killer.
Pretty simple, right? Great. You spent 10 minutes on something fun.
Can you follow the same pattern to deal with an issue like electoral reform? Probably not.
Still, the biggest problem I have with the survey is the introduction:
This is utterly disgraceful. No survey, online or by phone, should introduce a topic by saying the survey taker “hopes that you learn something”. Survey takers learn from the audience they are asking questions to (provided they have the right set of tools to analyze the data). What are we, as survey takers, supposed to be learning from MyDemocracy.ca? Using this kind of language is extremely condescending and insulting. I don’t think any person who has conducted polls in a professional manner would endorse an introduction like this one.
2) Do you think the MyDemocracy.ca survey is scientific or not? Why?
It depends on what the definition of “scientific” is. There are ways to generate meaningful data from a survey with an open-link and apply weights to it based on census targets. Still, there will be purists who dislike online data collection because the panels are allegedly self-selected, and who say that only the phone can be used because everyone needs to have an equal chance to take a poll (even in a world where fewer people have landlines). In this case, the value of the survey would depend on two issues: proper representation of the population (you can’t have 10,000 respondents, and just three from Quebec, for instance) and assurance that the people who took the poll are who they say they are. This second issue is crucial. On online panels (like the one we rely on at Insights West), we verify that the people who take the survey are who they say they are. Can MyDemocracy.ca assure us that a respondent who claimed to be Female, Boomer and Ontarian is not actually a 17-year-old boy from Tennessee having fun online?
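Canseco’s point about representation can be sketched in a few lines. The snippet below mirrors his own hypothetical of “10,000 respondents, and just three from Quebec”; the census shares are made-up round numbers for illustration, not real figures:

```python
# Illustrative post-stratification: reweight an open-link sample so its
# regional mix matches (hypothetical) census population shares.

sample = {"Ontario": 6000, "Quebec": 3, "West": 3997}    # raw respondent counts
census = {"Ontario": 0.39, "Quebec": 0.23, "West": 0.38}  # population shares (invented)

total = sum(sample.values())

weights = {}
for region, count in sample.items():
    observed_share = count / total
    # Respondents from under-represented regions count for more.
    weights[region] = census[region] / observed_share

# With only 3 Quebec respondents, each must stand in for hundreds of people:
print(round(weights["Quebec"], 1))  # → 766.7
```

An absurdly large weight like that is itself a red flag: weighting to census targets can adjust a reasonable sample, but it cannot rescue one with almost nobody in a cell.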

 

3) Can you briefly explain the Myers-Briggs test and if it is a scientific test?
The test has been used as a basis for psychological assessments for decades. But, as is the case with any other application of a theory, it can fail. Trying to justify a segmentation because you based it on Myers-Briggs is not enough. You can use a calculator and still make mistakes.
4) Is using 35,000 respondents to power a cluster analysis that determines archetypes a good way to conduct a scientific poll?
Probably not. The essence of proper polling is sample selection. Unfortunately, some reporters and editors have fallen into the trap of assuming that the more people take a survey, the more accurate it will be. This is just not true. You don’t need to talk to 10 million people. You need to talk to the right set of people. Case in point: we had the best online prediction of the popular vote in the United States last month, with a sample of 865 decided voters. We were closer than SurveyMonkey, which had a sample of 70,000 Americans. Sample size means little, certainly less than sample selection.
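The arithmetic backs Canseco up. The standard margin-of-error formula (a sketch, and it assumes a simple random sample, which is exactly the assumption a self-selected online survey violates) shows how little raw sample size buys you:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Classic 95% margin of error for a simple random sample of size n.
    Assumes random selection -- the very thing Canseco argues matters most."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n=865:   +/- {margin_of_error(865):.1%}")    # about 3.3 points
print(f"n=70000: +/- {margin_of_error(70000):.1%}")  # about 0.4 points
```

Eighty times the sample only shrinks the theoretical error from roughly ±3.3 to ±0.4 points, and even that gain evaporates if the people answering aren’t representative in the first place.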
5) Are you familiar with Vote Compass? If so, what are your thoughts on it?

I am familiar with Vote Compass. It was a fun exercise for a rainy afternoon. Nothing more. It does not supplant properly conducted polling.

Clifton van der Linden Q&A

What is your response to the criticism coming from academics, politicians, and social media that your electoral reform survey is unscientific and biased towards the Liberal government’s desired outcomes? Do you think the results of the survey will be able to inform the government on how to move forward with electoral reform?

I encourage critics to review the methodology, which is available directly in the application. I think the results will complement existing public consultations related to this initiative.

What are Canadians supposed to learn from being classified as an archetype?

The archetypes are empirically derived from a cluster analysis run on panel data comprised of more than 3,000 randomly selected Canadians. The purpose is to provide users with an engaging entry-point into the conversation about democratic values. For those who have not been particularly engaged in the conversation to date, it is an attempt to help them situate themselves within the discussion about how Parliament works. It’s also a way to encourage sharing across social media and hopefully encourage more Canadians to participate in the exercise and the broader discussion attached to it.
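Van der Linden’s description of archetypes “empirically derived from a cluster analysis run on panel data” can be illustrated with a toy k-means. Everything below is invented for illustration, including the two value dimensions, the three clusters, and the simulated answers; the article does not disclose Vox Pop’s actual items or method:

```python
import random

# Toy "archetype" derivation: respondents answer items scored roughly -1..1
# on two hypothetical value dimensions (say, support for direct democracy
# and for local representation), and k-means groups similar answer patterns.

random.seed(0)

def simulate_respondent(lean):
    # Jitter each respondent's answers around their group's tendency.
    return [lean[0] + random.uniform(-0.3, 0.3),
            lean[1] + random.uniform(-0.3, 0.3)]

panel = ([simulate_respondent((1, 1)) for _ in range(40)] +
         [simulate_respondent((-1, 1)) for _ in range(40)] +
         [simulate_respondent((0, -1)) for _ in range(40)])

def kmeans(points, k, iters=20):
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[nearest].append(p)
        # Move each center to the mean of its group (keep it if the group emptied).
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

centers, groups = kmeans(panel, 3)
print(sorted(len(g) for g in groups))
```

The cluster centers are then the “archetypes,” and each new survey-taker gets assigned to the nearest one. Canseco’s objection, note, is not to clustering itself but to whether the resulting labels actually fit the people placed under them.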

Did Vox Pop Labs come up with the questions for the electoral reform survey independently, or did the government give input and specify what questions it wanted asked?

The survey was developed in collaboration with an academic advisory panel and with input from the government.

In the past, your Vote Compass has been criticized for skewing Liberal, with people like Prof. Brock claiming she completed the quiz several different ways and always ended up with a Liberal result. Would you say that quiz was biased towards the Liberals?

The accusations about bias in Vote Compass have been entirely debunked, both in the public discourse and academic studies. There is more than ample evidence available to substantiate the outright dismissal of such claims.

Canadian Heritage granted you $576,500 for this upcoming project, and in your application you stated “Vox Pop Labs is committed to operating Project Tessera exclusively on a cost recovery basis.” Yet the last survey cost the government $326,500. Can you explain why the new project, which Vox Pop says it is doing strictly on a cost-recovery basis, costs $250,000 more than the last survey, especially since this upcoming project has the support of partners like the CBC?

Project Tessera is a much larger undertaking than MyDemocracy.ca. It involves national focus groups, a respondent panel five times the size of that used for MyDemocracy.ca, a working conference bringing together leading scholars in the field, and a more complex design and technical architecture.

What can Canadians expect to learn from Project Tessera?

Details about Project Tessera are still forthcoming, but the intent is to engage Canadians in a process of self-exploration and a reflection of who we are at this particular juncture in our history.

Final Notes: There is another interesting Q&A with Cliff van der Linden done by Maclean’s here.

I will be launching a crowdfunding account for this blog in the near future, but for now if you’d like to support my work please go to Loonie Politics and become a member. A subscription costs $5 a month, but with the promo code Gordon the yearly membership only costs $40, and I get a portion of the proceeds. On top of getting original content by me, you’ll also get pieces from veteran columnists Warren Kinsella and Michael Taube, as well as other up-and-comers like J. J. McCullough, editorial cartoons by Jeff Burney, and podcasts on the latest in politics. Click here to subscribe http://looniepolitics.com/register/.