Catching up: civic tech research, crisis of participation in Brazil, podcasts and more


picture by tollwerk on flickr

The dream consultancy

The Hewlett Foundation is seeking consultants to help design a potential, longer-term research collaborative to study the application of behavioral insights to nudge governments to respond to citizen feedback. This is just fantastic and deserves a blog post of its own. Hopefully I will be able to do that before the EOI period ends.

Rise and fall of participatory democracy in Brazil?

In an excellent article for Open Democracy, Thamy Pogrebinschi and Talita Tanscheit ask what happened to citizen participation in Brazil. The authors note that “The two main pillars on which institutional innovations in Brazil had been erected – extensive institutionalization and a strong civil society – have not been enough to prevent a functioning system of social participation being torn to shreds in little more than a year.”

I have been asked for my take on the issue more than once. Personally, I am not surprised, despite all the institutionalization and the strength of civil society. Given the current Brazilian context, I would be surprised if the participatory spaces the article examines (councils and conferences) remained unaffected.

Playing the devil’s advocate, this period of crisis may also be an opportunity to reflect on how policy councils and conferences could innovate themselves. While they are extremely important, one hypothesis is that these structures failed to appropriately channel the societal concerns and demands that later exploded into a political crisis, leading to the current situation.

Provocations aside, it is just too early to tell whether this is the definitive death of conferences and councils. And my sense is that their future will be contingent upon two key points: i) the direction that Brazilian politics takes following the 2018 general election (e.g. progressive vs. populist/authoritarian), and ii) the extent to which councils and conferences can adapt to the growing disintermediation in activism that we observe today.

The Business Model of Civic Tech?

If you are working in the civic tech space, you have probably come across a new report commissioned by the Knight Foundation and the Rita Allen Foundation, “Scaling Civic Tech: Paths to a Sustainable Future.” As highlighted by Christopher Wilson at the Methodical Snark, while not much in the report will surprise civic technologists, it does give the reader a good understanding of funders’ expectations on the issue of financial sustainability.

When thinking about business models of civic tech efforts, I wonder how much money and energy were devoted to having governments open up their datasets while neglecting the issue of how these governments procure technology. If 10% of those efforts had been dedicated to reforming the way governments procure technology, many of those in the civic tech space would now be less dependent on foundations’ grants (or insights on business models).

Having said this, I am a bit bothered by the debate about business models when it comes to democratic goods. After all, what would happen to elections if they depended on business models (or multiple rounds of foundations’ grants)?

Walking the talk: participatory grant making?

A new report commissioned by the Ford Foundation examines whether the time has come for participatory grant making. The report, authored by Cynthia Gibson, explores the potential use of participatory approaches by foundations, and offers a “starter” framework to inform the dialogue on the subject.

Well-informed by the literature on participatory and deliberative democracy, the report also touches upon the key question of whether philanthropic institutions, given their tax benefits, owe the public a voice in the decisions they make. If you are not convinced, this EconTalk episode with Rob Reich (Stanford) on foundations and philanthropy is rather instructive. There is also a great anecdote in the podcast that illustrates the argument for public voice, as described by Reich:

“So, in the final days of creating the Open Society Institute and associated foundations, there was disagreement amongst the staff that Soros had hired about exactly what their program areas, or areas of focus would be. And, to resolve a disagreement, Soros allegedly slammed his fist on the table and said, ‘Well, at the end of the day, it’s my money. We’re going to do it my way.’ And a program officer that he’d hired said, ‘Well, actually Mr. Soros, about 30% or 40% of it would have been the taxpayer’s money. So, I think some other people actually have a say in what you do, here, too.’ And he was fired the next week.”

Democracy podcasts

Talking about podcasts, the Real Democracy Now Podcast is fantastic. It is definitely one of the best things out there for practitioners and scholars working with citizen engagement.

Although broader in terms of the subjects covered, Talking Politics by David Runciman and Catherine Carr is another great option.

Other tips are more than welcome!

And this is brilliant…

(via @oso)

Other interesting stuff you may have missed

Study analyzing Pew survey data suggests a “gateway effect” where slacktivism by the politically uninterested may lead to greater political activity offline

Seeing the World Through the Other’s Eye: An Online Intervention Reducing Ethnic Prejudice

Smartphone monitoring streamlined information flows and improved inspection rates at public clinics across Punjab (ht @coscrovedent)

The Unintended Effects of Bottom-Up Accountability: Evidence from a Field Experiment in Peru

Literature review: does public reporting in the health sector influence quality, patient and provider’s perspective?

Techniques and Technologies for Mobilizing Citizens: Do They Work?


Techniques for mobilizing citizens to vote in elections have become highly sophisticated, in large part thanks to get-out-the-vote (GOTV) research, with fascinating experimental evidence from interventions to increase turnout. Until recently, the use of these techniques has been mostly limited to electoral processes, often resorting to resource-intensive tactics such as door-to-door canvassing and telemarketing campaigns. But as digital technologies such as email and SMS lower the costs of targeting and contacting individuals, the adaptation of these practices to participatory processes is becoming increasingly common. This leads to the question: how effective are GOTV-type efforts that use technology outside of the electoral realm?

One of the first (if not the first) efforts to bring together technology and GOTV techniques to non-electoral processes took place in the participatory budgeting (PB) process of the municipality of Ipatinga in Brazil in 2005. An automated system of targeted phone calls to landlines was deployed, with a recorded voice message from the mayor informing residents of the time and the location of the next PB meeting closest to them. Fast forward over a decade later, and New Yorkers can receive personalized text messages on their phones indicating the nearest PB polling location. Rather than a mere coincidence, the New York case illustrates a growing trend in participatory initiatives that – consciously or not – combine technology with traditional GOTV techniques to mobilize participation.

However, and unlike GOTV in elections, little is known about the effects of these efforts in participatory processes, for reasons I briefly speculated about in a previous post. We have just published a study in the British Journal of Political Science that, we hope, starts to reduce this gap between practice and knowledge. Entitled “A Get-Out-the-Vote Experiment on the World’s Largest Participatory Budgeting Vote in Brazil”, the study is co-authored by Jonathan Mellon, Fredrik M. Sjoberg and myself. The experiment was conducted in close collaboration with Rio Grande do Sul’s State Government (Brazil), which holds the world’s largest participatory budgeting process.

In the experiment, over 43,000 citizens were randomly assigned to receive email and text messages encouraging them to take part in the PB voting process. We used voting records to assess the impact of these messages on turnout and support for public investments. The turnout effect we document in the study is substantially larger than what has been found in most previous GOTV studies, and particularly those focusing on the effect of technologies like email and SMS. The increase in participation, however, did not alter which projects were selected through the PB vote: voters in the control and treatment groups shared the same preferences. In the study, we also assessed whether different message framing (e.g. intrinsic versus extrinsic) mattered. Not that much, we found, and a lottery incentive treatment had the opposite effect to what many might expect. Overall, our experiment suggests that tech-enabled GOTV approaches in participatory processes are rather promising if increasing levels of participation is one of the goals. But the “more research is needed” disclaimer, as usual, applies.
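
For readers new to GOTV experiments, here is a minimal, hypothetical sketch in Python (simulated data and assumed effect sizes, not the paper’s actual data or estimation strategy) of how a treatment effect on turnout can be estimated as a difference in proportions between randomly assigned groups:

```python
import math
import random

random.seed(42)

# Hypothetical data: citizens randomly assigned to receive a mobilization
# message (treatment) or nothing (control); turnout comes from voting records.
n = 43000
citizens = [{"treated": random.random() < 0.5} for _ in range(n)]
for c in citizens:
    base_rate = 0.08                      # assumed baseline turnout
    lift = 0.02 if c["treated"] else 0.0  # assumed effect of the message
    c["voted"] = random.random() < (base_rate + lift)

treated = [c for c in citizens if c["treated"]]
control = [c for c in citizens if not c["treated"]]

def turnout(group):
    """Share of the group that voted."""
    return sum(c["voted"] for c in group) / len(group)

p_t, p_c = turnout(treated), turnout(control)
effect = p_t - p_c

# Two-proportion z-test using the normal approximation.
p_pool = sum(c["voted"] for c in citizens) / n
se = math.sqrt(p_pool * (1 - p_pool) * (1 / len(treated) + 1 / len(control)))
z = effect / se
p_value = 1 - math.erf(abs(z) / math.sqrt(2))  # two-sided

print(f"Turnout: treatment {p_t:.3f} vs. control {p_c:.3f} (effect {effect:+.3f})")
print(f"z = {z:.2f}, p = {p_value:.4f}")
```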

You can find the final study (gated version) here, and the pre-published (open) version here.

 

New Papers Published: FixMyStreet and the World’s Largest Participatory Budgeting


Voting in Rio Grande do Sul’s Participatory Budgeting  (picture by Anderson Lopes)

Here are two new published papers that my colleagues Jon Mellon, Fredrik Sjoberg and I have been working on.

The first, The Effect of Bureaucratic Responsiveness on Citizen Participation, published in Public Administration Review, is – to our knowledge – the first study to quantitatively assess at the individual level the often-assumed effect of government responsiveness on citizen engagement. It also describes an example of how the data provided through digital platforms may be leveraged to better understand participatory behavior. This is the fruit of a research collaboration with MySociety, to whom we are extremely thankful.

Below is the abstract:

What effect does bureaucratic responsiveness have on citizen participation? Since the 1940s, attitudinal measures of perceived efficacy have been used to explain participation. The authors develop a “calculus of participation” that incorporates objective efficacy—the extent to which an individual’s participation actually has an impact—and test the model against behavioral data from the online application Fix My Street (n = 399,364). A successful first experience using Fix My Street is associated with a 57 percent increase in the probability of an individual submitting a second report, and the experience of bureaucratic responsiveness to the first report submitted has predictive power over all future report submissions. The findings highlight the importance of responsiveness for fostering an active citizenry while demonstrating the value of incidentally collected data to examine participatory behavior at the individual level.

An earlier, ungated version of the paper can be found here.
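
To give a sense of the kind of comparison the abstract describes, below is a small, purely illustrative sketch (invented records and a simple conditional-probability comparison, not the paper’s actual models or data) of how the probability of a second report can be contrasted by the outcome of the first:

```python
# Hypothetical reporter records: whether a user's first report received a
# response (a proxy for bureaucratic responsiveness) and whether the user
# went on to submit a second report.
reporters = [
    {"first_resolved": True,  "second_report": True},
    {"first_resolved": True,  "second_report": False},
    {"first_resolved": True,  "second_report": True},
    {"first_resolved": False, "second_report": False},
    {"first_resolved": False, "second_report": True},
    {"first_resolved": False, "second_report": False},
    # ... the actual study draws on n = 399,364 observations
]

def p_second(group):
    """Share of reporters in the group who submitted a second report."""
    return sum(r["second_report"] for r in group) / len(group)

responsive = [r for r in reporters if r["first_resolved"]]
unresponsive = [r for r in reporters if not r["first_resolved"]]

p_r, p_u = p_second(responsive), p_second(unresponsive)
relative_increase = (p_r - p_u) / p_u

print(f"P(second report | first report resolved)     = {p_r:.2f}")
print(f"P(second report | first report not resolved) = {p_u:.2f}")
print(f"Relative increase: {relative_increase:.0%}")
```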

The second paper, Does Online Voting Change the Outcome? Evidence from a Multi-mode Public Policy Referendum, has just been published in Electoral Studies. In an earlier JITP paper (ungated here) looking at Rio Grande do Sul State’s Participatory Budgeting – the world’s largest – we show that, when compared to offline voting, online voting tends to attract participants who are younger, male, of higher income and educational attainment, and more frequent social media users. Yet, one question remained: does the inclusion of new participants in the process with a different profile change the outcomes of the process (i.e. which projects are selected)? Below is the abstract of the paper.

Do online and offline voters differ in terms of policy preferences? The growth of Internet voting in recent years has opened up new channels of participation. Whether or not political outcomes change as a consequence of new modes of voting is an open question. Here we analyze all the votes cast both offline (n = 5.7 million) and online (n = 1.3 million) and compare the actual vote choices in a public policy referendum, the world’s largest participatory budgeting process, in Rio Grande do Sul in June 2014. In addition to examining aggregate outcomes, we also conducted two surveys to better understand the demographic profiles of who chooses to vote online and offline. We find that policy preferences of online and offline voters are no different, even though our data suggest important demographic differences between offline and online voters.
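
As a rough illustration of the aggregate comparison, here is a minimal sketch (with invented project tallies, not the referendum’s actual results) of checking whether online and offline voters would produce the same ranking of projects:

```python
# Hypothetical vote tallies per project, broken down by voting mode.
offline_votes = {"Project A": 2_100_000, "Project B": 1_850_000, "Project C": 1_750_000}
online_votes = {"Project A": 520_000, "Project B": 430_000, "Project C": 350_000}

def vote_shares(tallies):
    """Convert raw tallies into each project's share of that mode's total."""
    total = sum(tallies.values())
    return {project: votes / total for project, votes in tallies.items()}

def ranking(tallies):
    """Projects ordered from most to least voted."""
    return sorted(tallies, key=tallies.get, reverse=True)

offline_shares, online_shares = vote_shares(offline_votes), vote_shares(online_votes)

for project in offline_votes:
    print(f"{project}: offline {offline_shares[project]:.1%} vs. online {online_shares[project]:.1%}")

print("Same ranking across modes:", ranking(offline_votes) == ranking(online_votes))
```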

The extent to which these findings are transferable to other PB processes that combine online and offline voting remains an empirical question. In the meantime, nonetheless, these findings suggest a more nuanced view of the potential effects of digital channels as a supplementary means of engagement in participatory processes. I hope to share an ungated version of the paper in the coming days.

Bit by Bit: Social Research in the Digital Age


Pic by Jim Kaskade (flickr creative commons)

Matthew Salganik, Professor of Sociology at Princeton University, has recently put his forthcoming book on social research and big data online for an open review. Matthew is the author of many of my favorite academic works, including this experiment in which he and Duncan Watts test social influence by artificially inverting the popularity of songs in an online music market. He is also the brains behind All Our Ideas, an amazing tool that I have used in much of the work that I have been doing, including “The Governor Asks” in Brazil.

In Matthew’s words, this is a book “for social scientists that want to do more data science, and it is for data scientists that want to do more social science.” Even though I have not read the entire book, one of the things that has already impressed me is the simplicity with which Matthew explains complex topics, such as human computation, distributed data collection and digital experiments. For each topic, he highlights opportunities and provides experienced advice for those working with big data and the social sciences. His stance on social research in the digital age is brilliant and refreshing, and a wake-up call for many people working in that domain. Below is an excerpt from his preface:

“From data scientists, I’ve seen two common misunderstandings. The first is thinking that more data automatically solves problems. But, for social research that has not been my experience. In fact, for social research new types of data, as opposed to more of the same data, seems to be most helpful. The second misunderstanding that I’ve seen from data scientists is thinking that social science is just a bunch of fancy-talk wrapped around common sense. Of course, as a social scientist—more specifically as a sociologist—I don’t agree with that; I think that social science has a lot to offer. Smart people have been working hard to understand human behavior for a long time, and it seems unwise to ignore the wisdom that has accumulated from this effort. My hope is that this book will offer you some of that wisdom in a way that is easy to understand.

From social scientists, I’ve also seen two common misunderstandings. First, I’ve seen some people write-off the entire idea of social research using the tools of the digital age based on a few bad papers. If you are reading this book, you have probably already read a bunch of papers that use social media data in ways that are banal or wrong (or both). I have too. However, it would be a serious mistake to conclude from these examples that all digital age social research is bad. In fact, you’ve probably also read a bunch of papers that use survey data in ways that are banal or wrong, but you don’t write-off all research using surveys. That’s because you know that there is great research done with survey data, and in this book, I’m going to show you that there is also great research done with the tools of the digital age.

The second common misunderstanding that I’ve seen from social scientists is to confuse the present with the future. When assessing social research in the digital age—the research that I’m going to describe in this book—it is important to ask two distinct questions:

How well does this style of research work now?

How well will this style of research work in the future as the data landscape changes and as researchers devote more attention to these problems?

I have only gone through parts of the book (and yes, I did go beyond the preface). But from what I can see, it is a must read for those who are interested in digital technologies and the new frontiers of social research. And while reading it, why not respond to Matthew’s generous act by providing some comments? You can access the book here.

 

Three New Papers (and a presentation) on Civic Tech


This blog has been slow lately, but as I mentioned before, it is for a good cause. With some great colleagues, I’ve been working on a series of papers (and a book) on civic technology. The first three of these papers are out. There is much more to come, but in the meantime, you can find below the abstracts and links to each of the papers. I also add a link to a presentation that highlights some other issues we are looking at.

  • Effects of the Internet on Participation: Study of a Public Policy Referendum in Brazil.

Does online voting mobilize citizens who otherwise would not participate? During the annual participatory budgeting vote in the southern state of Rio Grande do Sul in Brazil – the world’s largest – Internet voters were asked whether they would have participated had there not been an online voting option (i-voting). The study documents an 8.2 percent increase in total turnout with the introduction of i-voting. In support of the mobilization hypothesis, unique survey data show that i-voting is mainly used by new participants rather than just for convenience by those who were already mobilized. The study also finds that age, gender, income, education, and social media usage are significant predictors of being online-only voters. Technology appears more likely to engage people who are younger, male, of higher income and educational attainment, and more frequent social media users.

Read more here.

  • The Effect of Government Responsiveness on Future Political Participation.

What effect does government responsiveness have on political participation? Since the 1940s political scientists have used attitudinal measures of perceived efficacy to explain participation. More recent work has focused on underlying genetic factors that condition citizen engagement. We develop a ‘Calculus of Participation’ that incorporates objective efficacy – the extent to which an individual’s participation actually has an impact – and test the model against behavioral data from FixMyStreet.com (n=399,364). We find that a successful first experience using FixMyStreet.com (e.g. reporting a pothole and having it fixed) is associated with a 54 percent increase in the probability of an individual submitting a second report. We also show that the experience of government responsiveness to the first report submitted has predictive power over all future report submissions. The findings highlight the importance of government responsiveness for fostering an active citizenry, while demonstrating the value of incidentally collected data to examine participatory behavior at the individual level.

Read more here.

  • Do Mobile Phone Surveys Work in Poor Countries? 

In this project, we analyzed whether mobile phone-based surveys are a feasible and cost-effective approach for gathering statistically representative information in four low-income countries (Afghanistan, Ethiopia, Mozambique, and Zimbabwe). Specifically, we focused on three primary research questions. First, can the mobile phone survey platform reach a nationally representative sample? Second, to what extent does linguistic fractionalization affect the ability to produce a representative sample? Third, how effectively does monetary compensation impact survey completion patterns? We find that samples from countries with higher mobile penetration rates more closely resembled the actual population. After weighting on demographic variables, sample imprecision was a challenge in the two lower feasibility countries (Ethiopia and Mozambique) with a sampling error of +/- 5 to 7 percent, while Zimbabwe’s estimates were more precise (sampling error of +/- 2.8 percent). Surveys performed reasonably well in reaching poor demographics, especially in Afghanistan and Zimbabwe. Rural women were consistently under-represented in the country samples, especially in Afghanistan and Ethiopia. Countries’ linguistic fractionalization may influence the ability to obtain nationally representative samples, although a material effect was difficult to discern through penetration rates and market composition. Although the experimentation design of the incentive compensation plan was compromised in Ethiopia and Zimbabwe, it seems that offering compensation for survey completion mitigated attrition rates in several of the pilot countries while not reducing overall costs. These effects varied across countries and cultural settings.

Read more here.

  • The haves and the have nots: is civic tech impacting the people who need it most? (presentation) 

Read more here.
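
Finally, for readers less familiar with the post-survey weighting step mentioned in the mobile phone survey abstract above, here is a minimal, hypothetical sketch (invented figures, a single demographic variable, and a naive simple-random-sampling margin of error rather than the study’s actual methodology) of post-stratification weighting and the resulting precision:

```python
import math

# Hypothetical example: reweight a phone survey so its urban/rural split
# matches the population, then compute a simple 95% margin of error.
population_share = {"urban": 0.35, "rural": 0.65}  # assumed census shares
sample_counts = {"urban": 700, "rural": 300}       # phone samples often skew urban
sample_size = sum(sample_counts.values())

# Post-stratification weight = population share / sample share, per cell.
weights = {
    cell: population_share[cell] / (sample_counts[cell] / sample_size)
    for cell in sample_counts
}

# Hypothetical survey outcome: share of respondents reporting some condition.
outcome_rate = {"urban": 0.40, "rural": 0.60}
weighted_estimate = sum(
    weights[cell] * (sample_counts[cell] / sample_size) * outcome_rate[cell]
    for cell in sample_counts
)

# Naive margin of error for a proportion under simple random sampling;
# it ignores the design effect that weighting introduces.
margin = 1.96 * math.sqrt(weighted_estimate * (1 - weighted_estimate) / sample_size)

print("Weights:", {cell: round(w, 2) for cell, w in weights.items()})
print(f"Weighted estimate: {weighted_estimate:.3f} +/- {margin:.3f}")
```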