March 24, 2015

2015 Democracy 360

  

The Report Card

Executive Summary

Samara's Democracy 360, a report card on the state of Canada's democracy, focuses on the complex relationship between citizens and political leadership. Democracy is about more than just casting a ballot every four years, so any conversation about how decisions on the future of our country are made needs to consider a more robust definition of "everyday democracy."

Samara's Democracy 360 expands the measurement of democracy and kick-starts a conversation using measurable indicators focused on three areas essential to a healthy democracy: communication, participation and political leadership. That is: talking, acting and leading.

Democracy 360 brings together a number of data sources, such as Samara's public opinion research and website content analyses, as well as publicly available data from other sources, including the House of Commons and Elections Canada. As such, it is designed to be a thorough, yet manageable, look at the health of citizens' relationship with politics, and one that will be repeated in 2017, in time for Canada's 150th birthday.

In an effort to set a benchmark that prompts reflection and discussion, Samara has awarded an overall letter grade as well as a letter grade for each of the three areas, as outlined in this report.

Democracy 360 Grade C

The Result

What does a C mean? Quite simply, our democracy is not doing as well as a country as rich as Canada deserves. Canadians are not participating in politics as much as they could, they don't believe politics affects them, and they don't see their leaders as influential or efficacious. To turn this situation around, Canada requires more than just higher voter turnout. Canada requires a culture shift towards "everyday democracy," in which citizens feel politics is a way to make change in the country and their voices are heard.



What's Inside Samara's Democracy 360?

Canadians don’t trust Members of Parliament or political parties and believe they largely fail to perform their core jobs:

  • Only 40% of Canadians report that they trust MPs to do what is right and only 42% of Canadians place some trust in political parties.
  • Canadians give MPs and political parties failing grades on nearly all their responsibilities, ranging from reaching out to citizens to their work in Parliament. Overall, Canadians feel MPs do a better job representing the views of the party than they do representing their constituents.

Politics is seen as irrelevant and, as a result, Canadians are withdrawing from the democratic system:

  • Only 31% of Canadians believe politics affects them every day.
  • Only 37% give any time or resources to formal political activities between elections.
  • A surprising number (39%) say they haven’t had a single political conversation—online or offline—in a year-long period.
  • A federal voter turnout of 61% puts Canada in the bottom fifth among democracies, according to the Organisation for Economic Co-operation and Development.

To make politics relevant, Canadians will need to see the value in politics and democracy. This will require the following changes:

  • MPs who serve as reliable, vibrant, two-way links between citizens and government.
  • Citizens who become more politically active at and beyond the ballot box.
  • Political leadership that acts in ways that encourage Canadians' involvement and demonstrate how politics is a worthwhile way to invest time in order to make a difference.

Despite an overall unhealthy picture, the Democracy 360 also reveals several positive signs on which to build:  

  • MPs make considerable efforts—through social media, householder mailings and their websites—to reach out to Canadians. With small changes, they can communicate much more effectively.
  • Over half of Canadians petition, donate to charity and volunteer, revealing a desire to connect to causes rooted in and affected by politics.

An election in 2015 presents a real opportunity to build momentum towards a more engaging political culture:

  • Individual volunteers, candidates and parties, as well as community groups, can all take simple steps to change how citizens get involved and demand a more responsive democracy.
  • Under #TalkActLead, anyone can contribute ideas and solutions to improve how politics works. To spur engagement, Samara Canada will be releasing tip sheets and resources as the election approaches.

The Numbers

Samara’s Democracy 360 uses quantifiable indicators to focus on three areas that are essential to a healthy democracy: communication, participation and political leadership.

The indicators measured in this report track Canadian democracy across a wide range of areas, from diversity in the House of Commons to the many ways Canadians can participate in politics to how Members of Parliament and parties function. While not exhaustive, together the indicators paint a rich picture of the way that Canadians talk, act and lead in politics, adding multiple dimensions to voter turnout, the metric most commonly used to measure democracy.

The Methodology

What is Samara’s Democracy 360?

The Democracy 360 is Samara Canada's made-in-Canada report card on the relationship between citizens and political leadership. It combines 23 quantifiable indicators focused on three areas: communication, participation and political leadership. The Democracy 360 will allow Canadians to compare and assess their democracy over time; Samara plans to revisit it every two years to measure improvement or decline, with the second edition due out in 2017.


Where Did the Idea for the Democracy 360 Come From?

During Samara's exit interviews in 2009, a former MP identified a gap: Canadians need a more systematic understanding of how well our democracy is working, one that doesn't rely on anecdote. The Democracy 360's conceptual design, data collection and analysis were conducted over the subsequent years.

Samara has three main objectives with the Democracy 360:

  1. Engage: Frame and provoke, in increasingly wider circles, a debate about how Canada’s politics can be made more relevant and responsive.
  2. Educate: Prompt Canadians to compare and assess progress on Canada’s democratic vitality with evidence in hand, looking beyond voter turnout.
  3. Enroll: Identify precise areas where change can be advanced, both by Samara and by others.


Why the Name “the Democracy 360”?

The circle echoed by the title's "360" draws on several themes: to circle back and take stock, to provide a 360-degree scan, and to draw attention to the "vicious circle" and the "virtuous circle," a useful metaphor for the interdependence between citizens and their elected leaders when it comes to building a responsive democracy.

How Did the Research Unfold?

The design, data collection, analysis and scoring process has involved advice from academics, practitioners in the civic and political engagement space, and Samara’s community. A few of the methodological components and findings have also been presented at academic conferences by Samara researchers, and many elements of the Democracy 360 were tested through the research and release of Samara’s Democracy Reports from 2012 to 2014.

Though the relationship between citizens, MPs and political parties has always been at the core of the Democracy 360, how to organize this analysis generated a great deal of discussion. Initially, the Canadian Democratic Audit series inspired a focus on measuring democratic values like representativeness, inclusiveness and participation. Though these values underpin many of the 360's indicators, we decided to organize the Democracy 360 by three areas of activity: communication, participation and leadership.

What Does the Democracy 360 Measure and Not Measure?

Quality: The benefit of the Democracy 360's evaluation is in its breadth, not the depth of its indicators. For many of the indicators, for example "householders," measuring quality presents a challenge. While this level of analysis is important, it is beyond the scope of this broader project. Thus, the Democracy 360 focuses on whether the activity did, or did not, happen. This measurement choice was made in order to establish a benchmark. We encourage future research to probe deeper within each measure (such as quality or frequency).

All democratic players: By focusing on elected political leadership, the Democracy 360 misses the full complexity of democracy, including the work of senators, public servants, political staff, journalists and the judiciary, a trade-off made to keep the scope of the project feasible. Samara also believes that the relationship between citizens and their representatives is at the heart of democracy, which is why other projects, like the Samara MP Exit Interviews, have also probed the nature of this representative relationship in Canada.

Municipal and Provincial Political Leadership: The Democracy 360 does not systematically evaluate provincial and municipal political leadership in the way it evaluates federal leadership. However, some measures of political participation and Canadians' views on politics are not specific to the federal arena. Admittedly, this is conceptually less tidy, but it reflects the realities of how citizens think about politics as observed in Samara's focus group research: for many Canadians, politics as an activity is not neatly demarcated into three levels of government.

How Were the Indicators In the Democracy 360 Selected?

From a long list of potential indicators, five criteria were used to select those that measure communication, participation and leadership in Canada:

  1. Accuracy: Is the measure precise?
  2. Reliability: Is the measure a consistent capture of the activity?
  3. Feasibility: Given finite time and resources, can the data be collected and analyzed?
  4. Replicability: Can the measure be captured again in a similar fashion?
  5. Dynamism: Is the indicator's change (improvement or decline) measurable?

Where Did the Democracy 360 Data Come From?

The Democracy 360 brings together a number of data sources, including Samara's public opinion research and website content analyses, as well as data external to Samara, such as records from the House of Commons and Elections Canada. The data in the 2015 Democracy 360 dates from 2011 to 2014. There are four general sources of data in the report card:

1. 2014 Samara Citizens' Survey

Public opinion data in the Democracy 360 was drawn from the Samara Citizens' Survey, conducted in English and French using an online sample of 2,406 Canadian residents over 18 years of age living in the ten provinces. Data was collected between December 12 and December 31, 2014. The survey has a credibility interval of 1.99 percentage points, 19 times out of 20.

Responses were weighted to reflect a nationally representative sample of Canadians by gender, region and age, as well as whether respondents were born inside or outside of Canada, whether they spoke English, French or another language at home, and self-reported voter turnout. Questions about Canadians' participation were limited to the last 12 months. Data missing at random were imputed using the mi commands in Stata 12.

Provincial breakdowns only include statistically significant information (p-value ≤ 0.10).

Samara worked with Professors Peter Loewen (University of Toronto) and Daniel Rubenson (Ryerson University) to complete the data collection, cleaning, weighting and imputation. The survey was conducted by Qualtrics.

Please request the Samara 360 Survey Appendix for precise data manipulation, survey question wording and unweighted frequencies [info@samaracanada.com].

2. House of Commons Records

Members' Expenditures Report (April 1, 2013 to March 31, 2014)

The 2013-2014 Members' Expenditures Reports were used to determine the value of the householder indicator. The reported value reflects the percentage of MPs who spent money on householders during the reporting period, combining funds reported in the Member's Office Budget and Resources published by the House of Commons. The Government of Canada makes MPs' expenditure data publicly available in XML; a Ruby script was used to transform the data to CSV format (a rough sketch of this kind of transformation appears at the end of this section). The script can be found here. The total number of MPs included in the analysis was 312.

Diversity Data on Parliament (December 2014)

Demographic information about MPs, including age, gender, place of birth and Indigenous status, was compiled using information on parl.gc.ca. Visible minority status is not formally reported by the House of Commons, so it was determined using MPs' biographical pages on parl.gc.ca. The data was updated in December 2014.

Using the demographic data, Samara created a Proportionality Index score for each group. This score compares how closely a group's representation in the House of Commons matches its proportion of Canada's population (a worked sketch appears at the end of this section). Canadian population figures were drawn from Statistics Canada's 2011 Census. A score of 100 equals perfect parity between a group's presence among MPs in the House and its share of the Canadian population.

3. Publicly Accessible Data

2014 Member of Parliament Websites

The Member of Parliament website project analyzed 299 MPs' websites against a 16-point checklist. Data was collected from May to July 2014 by Samara volunteers and staff. Excluded from the analysis were five vacant ridings (by-elections were pending at the time of data collection) and the four party leaders.

2013 Electoral District Association Websites

The EDA website project analyzed the websites of riding associations, the local chapters of our national parties, against a 15-point checklist. In total, Samara researchers searched for 1,307 sites (308 for each national party and 75 for the Bloc Québécois). Data was collected in August and September 2013 by Samara staff and volunteers.

The analysis was redone in 2014. However, the summer of 2014 marked a time of transition for many EDAs, as several riding boundaries shifted to reflect the introduction of 30 new ridings. As a result, Samara's findings did not present a complete picture of EDA websites in Canada, and the Democracy 360 uses the last complete year (2013).

Members of Parliament on Social Media 2013

Full Duplex's analysis in the 2013 report "Peace, Order and Google Government" shares the number of MPs using social media accounts on Twitter, Facebook and YouTube. The data was collected using Sysomos Heartbeat (a media monitoring tool) in November and December 2013. Subsequent analysis was executed using Heartbeat and Compass. The full report can be found here.

4. Elections Canada

Elections Canada's Estimation of Voter Turnout by Age Group and Gender at the 2011 Federal General Election reports turnout by age, gender and province. Elections Canada relies on its administrative data to generate these figures. To improve the accuracy of turnout figures, Elections Canada uses the number of Canadian citizens over voting age, rather than the number of registered voters, as the denominator in its calculations (recognizing that the voter registration list may be incomplete).
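
As noted above, the Members' Expenditures data was transformed from XML to CSV with a Ruby script (linked in the text). As a rough illustration of the same idea, here is a minimal Python sketch; the tag and field names (member, name, householders) are hypothetical, since the House of Commons XML schema is not described here.

```python
# Minimal sketch of an XML-to-CSV transformation like the one described above.
# NOTE: the tag and field names ("member", "name", "householders") are
# hypothetical; the real House of Commons schema may differ.
import csv
import xml.etree.ElementTree as ET

def expenditures_xml_to_csv(xml_path: str, csv_path: str) -> None:
    root = ET.parse(xml_path).getroot()
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["mp_name", "householder_spending"])
        for member in root.iter("member"):                # hypothetical tag
            name = member.findtext("name", "")            # hypothetical field
            spent = member.findtext("householders", "0")  # hypothetical field
            writer.writerow([name, spent])
```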
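
The Proportionality Index also lends itself to a short worked example. On the natural reading of the description above, the index is a group's share of MPs divided by its share of the population, scaled so that 100 means parity; that ratio form is an assumption, and the figures in the example are illustrative rather than taken from the report.

```python
def proportionality_index(share_of_mps: float, share_of_population: float) -> float:
    """Assumed form of the index: the ratio of a group's share of MPs
    to its share of the population, scaled so 100 = perfect parity."""
    return 100 * share_of_mps / share_of_population

# Illustrative figures only: a group making up 15% of the population
# but holding 9% of seats would score 60 on the index.
print(proportionality_index(0.09, 0.15))  # 60.0
```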

Why Create a Composite Score?

Bringing together indicators into an overall composite value, index or "score" is an inherently subjective activity (read Malcolm Gladwell on this point). Inevitably, there are a number of choices to be made: whether certain indicators should be given greater weight because they matter more, or whether all indicators should be treated the same. Seldom is there an obvious or "correct" way to combine such values that leads to complete agreement.

Despite the challenges associated with composite scores, they can still be worthwhile constructs because of the other advantages they offer, such as ease of comprehension and comparison.

Comprehension: With over twenty indicators, each indicator on its own may mean little; in other words, it is difficult to see the overall picture with so many data points in play. Bringing the values together helps with this challenge. That said, the combination can also obscure differences among indicator values (especially large ones), which is why Samara published "The Democracy 360: The Numbers," which shows the values for each of the 23 indicators.

Comparison: A composite score is a powerful tool for comparison. Some indicators may improve or decline in any given year, but these changes provide little insight into the overall picture. A composite score can reveal whether, on the whole, there is improvement or decline over time. As well, when the three areas (communication, participation and political leadership) each have a composite score, it is easier to see which areas are faring better than others.

How Was a Composite Score Generated?

Each of the indicators in the Democracy 360 is scored out of 100; the lower the score, the worse the result. While the scale allows for extremes (a perfect score of 100, or zero), it's unlikely that some indicators will ever reach the upper and lower limits. Nevertheless, using a 100-point scale across all indicators provides consistency and reduces subjectivity, compared to creating point scales individually adjusted to the variation in each indicator.

The structure of the Democracy 360 is nested and hierarchical. The three areas (communication, participation and leadership), as well as the indicators within each area, were weighted equally. The weight applied to each indicator therefore depends on the number of indicators within its area. For indicators with sub-indicators, the indicator's value is the average of its sub-indicators.

The Democracy 360's weighting scheme is based in conceptual rather than statistical theory; in other words, it focuses on the three key areas of democracy that Samara has identified. Researchers at Samara did conduct a factor analysis of the indicators that rely on survey data; however, since this process required omitting the three other data sources, the results were treated as revealing rather than directive.

Given the Democracy 360’s conceptual focus, Samara determined that each category would be treated as equally important. This meant the composite score applies an equal weighting scheme to each of the three areas and an equal weighting scheme within each area to each of its indicators.

Scores for each of the three areas (communication, participation and political leadership) are calculated by averaging the scores of the area’s indicators. The three areas (each weighted equally) were added to create a total score out of 100.

Communication + Participation + Leadership = Samara 360

Score out of 33.33 + Score out of 33.33 + Score out of 33.33 = Samara 360 (out of 100)
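
Concretely, this arithmetic can be sketched in a few lines. The sketch below is a reconstruction from the description above, not Samara's actual code; the indicator lists are placeholders, and indicators with sub-indicators are assumed to have been averaged into a single value first.

```python
# Reconstruction of the equal-weighting scheme described above;
# indicator values are placeholders, each scored out of 100.
from statistics import mean

def area_score(indicators: list[float]) -> float:
    """Average an area's indicators and rescale so the area
    contributes up to 33.33 points of the 100-point composite.
    (Indicators with sub-indicators are assumed pre-averaged.)"""
    return mean(indicators) / 3

def democracy_360(communication: list[float],
                  participation: list[float],
                  leadership: list[float]) -> float:
    """Sum of the three equally weighted area scores."""
    return (area_score(communication)
            + area_score(participation)
            + area_score(leadership))
```

As a cross-check against the figures below: area averages of 55.99, 46.59 and 42.29 rescale to 18.66, 15.53 and 14.10, which sum to the reported composite of 48.29.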

Communication

  • Number of Indicators: 7
  • Sub-Indicators: 8
  • Weight to Each Indicator: 4.76
  • Sum (out of 33.33): 18.66
  • Percentage: 55.99%

Participation

  • Number of Indicators: 9
  • Sub-Indicators: 17
  • Weight to Each Indicator: 3.7
  • Sum (out of 33.33): 15.53
  • Percentage: 46.59%

Leadership

  • Number of Indicators: 7
  • Sub-Indicators: 8
  • Weight to Each Indicator: 4.76
  • Sum (out of 33.33): 14.1
  • Percentage: 42.29%

Total

  • Number of Indicators: 21
  • Sub-Indicators: 38
  • Weight to Each Indicator: --
  • Sum (out of 100): 48.29
  • Percentage: 48.29%

Why Give a Letter Grade In the Democracy 360?

Letter grades were not part of the original conception of the Democracy 360, but advice from communications experts suggested a report card component would be a valuable tool to help Canadians understand the data. This reflects a trade-off: letter grades add another layer of subjectivity to the data, but they make the research accessible and useful to Canadians.

How Was a Letter Grade Assigned?  

Deciding on a letter grade scale was more complicated than simply applying the grading system familiar from most school settings (e.g. an A above 80%, an F below 50%). Communications advice suggested that an overly negative report card (with several Fs), year over year, would be discouraging and would risk greater alienation from the political arena and from political participation if Canadians do not see hope for improvement. The advice also suggested the report card would be less effective as a communications tool if the grades were unlikely to change year over year. Given the mix of indicators in the Democracy 360, it is reasonable to anticipate small shifts of improvement or decline, but not large shifts (that is, 10 percentage points or more) in two years' time.

This meant Samara needed a letter grade scale where (1) the value of an F grade would need to be lower than the commonly used 50% threshold and (2) each letter grade should cover a fairly narrow numerical range so that there would be greater sensitivity to changes in the letter grades assigned.

To design this grading system, Samara's researchers first sought outside input from a small group of experts, practitioners and engaged citizens in the democratic space to help determine where the Democracy 360 scores are now, relative to where they could be if Canada's democracy were stronger in 10 years' time. Specifically, each provided their opinion of what a "great" score for each indicator would look like in roughly 10 years. The group did not have access to the current Democracy 360 values, though they were provided some historical and comparative data as reference points.

What Is a “Good” Grade?

Indicator values provided by the outside consultants were combined using the same weighting scheme as the composite scores. Overall, their input suggested a goalpost for the Democracy 360 of 60%, which is 12 percentage points higher than the 2014 score of 48%.

The goalposts for the three areas (communication, participation and leadership) helped to establish the upper limit for the letter grade scale. For example, the 70% goalpost for communication would correspond with the highest grade on this scale (A+).

Inherently, determining the rest of the scale values was arbitrary, but the process was driven by a desire to have each letter grade cover about the same numerical range (that is, two to three points).
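
Taken together, the grading logic amounts to mapping a 100-point score onto narrow bands whose top grade is pinned to the expert goalpost rather than to 100, with F falling well below the usual 50% threshold. The sketch below illustrates that logic under assumed cut-offs; the report does not publish its exact scale, so the 2.5-point band width is an illustrative guess, and the resulting grades will not exactly reproduce the published ones.

```python
def letter_grade(score: float, goalpost: float) -> str:
    """Map a score out of 100 to a letter grade using narrow bands
    anchored to the expert goalpost (which earns an A+). The 2.5-point
    band width is an illustrative assumption, not the report's actual
    scale; note that it places F well below the usual 50% threshold,
    as the designers intended."""
    grades = ["A+", "A", "A-", "B+", "B", "B-",
              "C+", "C", "C-", "D+", "D", "D-"]
    band = 2.5  # assumed width of each grade band, in points
    for i, grade in enumerate(grades):
        if score >= goalpost - (i + 1) * band:
            return grade
    return "F"

# Illustrative only: the 2014 composite of 48.29 against the 60-point
# goalpost. (The published scale evidently used different cut-offs,
# since the report graded 48% a C.)
print(letter_grade(48.29, 60))  # -> "B" under these assumed bands
```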

Read the full report