Posted: March 16th, 2015 | Author: Chris Berendes | Filed under: Civic engagement, Framework, Metrics | 3 Comments »
What decisions do we face?
Now that we’ve sketched out what one type of public engagement does and how it does it, we can get a better handle on the kinds of decisions that will arise as we manage a public engagement process.
Image: Aleksander Markin
Dials and indicators are useful if …
In the middle of a public engagement project, the most basic decision a manager faces is "Are we done yet?"
In considering projects in retrospect, for instance when weighing which methods or consultants to use for an upcoming assignment, the basic question might be "Was the project successful?"
These big questions break down into lots of little questions, as we can see in our bridge example.
Is more engagement work required to create the needed level of long-term support across stakeholders? Can we count on the bridge's neighbors to see the project out, in spite of the disruptions we expect as a result of construction? What about unrealistic expectations: Have the overly rosy hopes for rush hour traffic reductions been corrected? And what about perceived unfairness: Are the city's taxpayers likely to continue to fund bridge maintenance even though the bridge primarily benefits commuters?
Questions like these could be addressed by polling, surveys, and interviews of the relevant stakeholders.
Of course, since the goal is long-term stakeholder support, the proof of the pudding is whether – five, ten, and twenty years hence – the support is there. Research centers and foundations that study public engagement should revisit past projects to determine how well stakeholder support was sustained.
As we manage each of the five component processes as the public engagement effort proceeds, we are continually making one important decision:
Have we done enough at each particular stage to allow succeeding stages to be successful?
This is a broad topic, but I’ll illustrate the approach with questions that could, in turn, drive metrics.
OUTREACH: Are we reaching cyclists as well as commuters, low income as well as middle income residents? Does the sample group pulled into the engagement process match the larger stakeholder population in key characteristics? As this larger population changes over time, can we pull the right kinds of new members into our sample group to stay in sync? Have we pulled in enough participants for subsequent processes to succeed, e.g., for a survey to be statistically reliable?
These questions can be answered by demographic surveys of participants, compared to polls of the underlying population of stakeholders.
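One way to make the "does the sample match the population?" comparison concrete is a goodness-of-fit check. A minimal sketch, with invented stakeholder groups and counts (the real categories and shares would come from the demographic survey and polls):

```python
# Sketch: does the engagement sample mirror the stakeholder population?
# Stakeholder groups, counts, and shares below are invented for illustration.

def chi_square_stat(observed, expected_shares):
    """Chi-square goodness-of-fit statistic comparing sample counts
    to the demographic shares measured in a poll of the population."""
    total = sum(observed)
    stat = 0.0
    for obs, share in zip(observed, expected_shares):
        expected = total * share
        stat += (obs - expected) ** 2 / expected
    return stat

# Participants pulled in by outreach, by hypothetical stakeholder group:
# commuters, neighbors, cyclists, anglers.
sample_counts = [120, 45, 25, 10]
population_shares = [0.55, 0.25, 0.15, 0.05]  # from a poll of all stakeholders

stat = chi_square_stat(sample_counts, population_shares)
# The 5% critical value for df = 4 - 1 = 3 is about 7.815.
print(f"chi-square = {stat:.2f}; representative at the 5% level: {stat < 7.815}")
```

With these invented numbers the statistic comes in well under the critical value, so we would not conclude the sample is unrepresentative; a real analysis would weigh age, income, and the other characteristics the survey captures.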
SOCIAL SURVEY: Can we use our survey of commuters, our interviews with cyclists, our polls of taxpayers to design relevant education efforts and anticipate the key issues in the negotiation phase? Are we assessing all stakeholder groups in a reliable way? Are we following the relevant best practices from statistics, ethnography, and so forth?
EDUCATION, INFORMATION, PUBLIC RELATIONS: Do our presentations to automobile association members in fact bring commuters up to speed about the different needs of cyclists? Do users engage with our website in enough depth to understand the uncertainties in the bridge construction project? Do the bridge’s neighbors have a clear sense of how construction will affect them?
NEGOTIATION: Is the negotiation phase structured to address the concerns we’ve uncovered in the social survey phase? Once the negotiation phase has concluded, are taxpayers ready to support the bridge? Are cyclists comfortable that they’ll be able to use the bridge safely? If we’ve added a park project to compensate the bridge’s neighbors for the construction impact, does the neighborhood understand and accept the relationship between the park and the main bridge project?
OPENING UP: Once we’ve opened up the process, are taxpayers who weren’t directly involved in the previous four phases as supportive of the bridge project as taxpayers who participated in the negotiation? Are the bicycle activists we didn’t reach with the initial public education campaign comfortable that they too will be able to use the bridge safely? Are the stakeholders who participated in the public engagement process directly and those who learned of the process through our dissemination similar in their degree of project understanding and support?
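The opening-up question, whether stakeholders reached only through dissemination end up as supportive as direct participants, can be framed as a comparison of two support rates. A sketch with hypothetical survey numbers:

```python
import math

# Sketch: are stakeholders reached only by dissemination as supportive as
# direct participants? All survey numbers below are hypothetical.

def two_proportion_z(support_a, n_a, support_b, n_b):
    """Z statistic for the difference between two support rates,
    using the pooled-proportion standard error."""
    p_a, p_b = support_a / n_a, support_b / n_b
    pooled = (support_a + support_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Supporters / respondents: direct participants vs dissemination-only.
z = two_proportion_z(400, 500, 780, 1000)
# |z| beyond about 1.96 would suggest a real support gap at the 5% level.
print(f"z = {z:.2f}")
```

Here the hypothetical gap (80% vs 78% support) is within sampling noise; a large positive z would instead signal that the opening-up phase had not yet brought non-participants along.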
Metrics, Decision-Making, and an Orientation to Results
… they get us where we’re going.
This post demonstrates the value of the view of public engagement laid out in the previous two posts. By fitting public engagement into the larger picture of public infrastructure projects, we have a context for considering the kinds of decisions that will need to be made, and thus what kinds of questions and metrics will be useful.
Posted: March 12th, 2015 | Author: Chris Berendes | Filed under: Civic engagement, Framework, Uncategorized | 7 Comments »
How Public Engagement Achieves its Goals
This past Monday, I laid out how public engagement bolsters long-term stakeholder support for large infrastructure projects by creating more realistic expectations and reducing perceptions of unfairness.
In this post, I’ll argue that achieving these results requires five processes:
- Reaching out to the full diversity of stakeholders to create a representative sample with whom we can work directly,
- Assessing the sample’s experience and understanding,
- Informing and educating the sample,
- Negotiating among stakeholders in the sample, and finally
- Opening up the process to include all stakeholders.
OUTREACH: Stakeholders are diverse. In planning and constructing the bridge, bicyclists can't speak for the car commuters, who in turn can't speak for the construction workers or the taxpayers. Many stakeholders, such as single parents, students juggling studies and work, or older people who stay involved in spite of physical challenges, lead busy lives. Some, like commuters, live and work far away. All will have to be brought into the process, and it will take thought and effort to do so.
SOCIAL SURVEY: Stakeholders are generally much more diverse than the project team. It’s difficult to know, in advance, what experiences, skills, and expectations various groups bring to the engagement process and what they know about one another. One way or another, we have to find out, through polls, interviews, focus groups, public meetings, and similar activities.
INFORMATION: Once we’ve determined what the gaps are, we have to fill them in. For instance, we’ll show drivers what cyclists need to share the road safely, describe construction processes and schedules to residents, so they know what to expect, and bring taxpayers up to speed on the advantages and disadvantages of levying tolls to pay for construction and maintenance. We’ll achieve this through guidebooks, video, websites, discussion, among other ways.
NEGOTIATION: Information begins to address perceived unfairness, but generally more is needed. The city pays for the bridge, but the bridge serves commuters who pay income taxes in the adjoining state. The bridge's neighbors will bear the brunt of the construction process and the long-term increase in traffic without getting commensurate benefits. Negotiation may be required to determine side arrangements (e.g., bridge tolls, commuter taxes, a new park to compensate the bridge's neighbors) that will draw support from enough stakeholders to underwrite the long-term success of the bridge. In public engagement, these negotiations are often informal, structured as dialogue and deliberation.
OPENING UP: Bridge project stakeholders number in the hundreds of thousands. Even if we’ve reached what’s considered to be “large numbers” in the previous four steps, it’s unlikely to be more than a few thousand. We need the long term support of a much larger proportion of stakeholders. So we must open up the process to reach all stakeholders, well beyond the sample. This is generally achieved through advertising, public service announcements, and large scale events that draw media attention.
So far, we’ve assumed that the public engagement process doesn’t affect the planning, construction, operation, and maintenance of the bridge directly. But, of course, modifying these and other aspects of the bridge project may improve stakeholder support. E.g. a bike path can be added to accommodate local cyclists, the building schedule may be modified to reduce impact on surrounding neighborhoods, and so forth.
Project tuning can, in turn, affect each of the components of public engagement. Changing the bridge design so that it accommodates only cyclists and pedestrians may remove commuters as a stakeholder group, and thus reduce outreach requirements. (Though commuters may have something to say if they expected a new bridge to ease their morning and afternoon travels.)
If we increase the scope of the project, by adding a bikeway to what was before a bridge designed only for cars and trucks, outreach requirements increase. The requirements for each of the following phases may also become more complex.
How do we know that our public engagement efforts have been successful? If we’re responsible for just one component process, how do we determine that we’ve done our part? For these and other questions of decision-making and metrics, come back next Monday.
Posted: March 9th, 2015 | Author: Chris Berendes | Filed under: Civic engagement, Framework | 2 Comments »
Over the past decade, some of the wisest, most experienced practitioners in Public Engagement (PE) have puzzled over three interlocked problems:
METRICS: When PE processes can cost six and seven figures, at a cost per person engaged ranging into the hundreds of dollars for a single day event, how can we measure success and justify the expense to skeptics?
ONLINE: How can practitioners apply what they know about face to face PE and use that to leverage social media, email, and other online tools?
SCALING: Public engagement has been proven to be successful in processes involving thousands of people, but how can we expand it, cost-effectively, to reach hundreds of thousands or millions?
These questions have been intractable, in my view, because of something that’s generally a strength in public engagement work: practitioners are skilled in the fine details of the work, the context of organizations, the particulars of urban settings, the personalities of participants. The view “from the trenches” is critical but needs to be in dialogue with a view that allows us to see broad patterns and connections.
To create a foundation for this broader view, let's consider one type of public engagement and explore why it is needed.
Long-term stakeholder support is required to make large infrastructure projects successful.
A tremendous variety of activities have been described as “public engagement”. Here, I’ll focus on public engagement to support large infrastructure projects that have a significant “real world” component affecting tens or hundreds of thousands of people, over decades. These projects require long-term stakeholder support to be successful. For instance, the success of a new bridge over decades depends on construction funded by taxpayers, continued use by drivers, cyclists, and pedestrians, and long-term maintenance funded by current and future taxpayers.
What Gets in the Way of Stakeholder Support?
Taxpayers may fear additional tax burdens; neighbors may worry about noise, dirt, and risks during construction, as well as increased traffic over the long term; cyclists may be concerned that the bridge will be unsafe for them. In short, stakeholders may anticipate various negative consequences.
Stakeholders may also have unrealistically positive expectations. This will increase support in the short-term, but it will undermine long-term support when those expectations are not fulfilled.
Further, the bridge’s neighbors may feel that, while they’re paying most of the taxes for the bridge and suffering through the chaos of construction, commuters who live outside the city and far away from the construction will get most of the benefit. In general, stakeholders who bear more than their share of costs or garner less than their share of benefits will perceive the project to be unfair.
How Do Large Infrastructure Projects Make Stakeholder Support Difficult?
In brief, the lumpy, inherently uncertain, and irreversible nature of project impacts collides with the broad diversity and general inexperience of stakeholders to create unrealistic expectations and grievances.
Large Infrastructure Projects Are Challenging in Three Ways
LUMPINESS: The bridge changes the view for thousands of neighbors, the ride for thousands of commuters, taxes and other costs for thousands of residents. This can't be tuned to affect each person or group differently. You and I and every one of our neighbors may consume a different soup at lunch, but we all "consume" the same bridge.
INHERENT UNCERTAINTY: As bridge construction goes on, there may be environmental remediation required that no one anticipated. Or perhaps the bridge is completed successfully, but we find that commuting patterns have changed as more people move into the city or switch to public transit.
IRREVERSIBILITY: It’s expensive to “unbuild” a bridge, and impossible to move it. This is true of infrastructure projects generally. Yet stakeholders generally experience the full impact only after the project is complete.
Project Stakeholders Add Two Further Challenges
DIVERSITY: As I’ve noted, the bridge’s success rests on long term and often intense support from many different groups of people. Investors, commuters, the bridge’s neighbors, construction workers, and people who come to the river to fish and paddle are all important stakeholders.
They "arrive" at different times, from the planner or civic activist who has tracked the project for years before ground is broken, to the more narrowly focused resident who doesn't get involved until much later, when the dirt and noise obtrude into their neighborhood.
Stakeholders differ from one another in age, education, socioeconomic level, income, and so forth, in how they relate to the bridge project, and in how the bridge affects their interests.
INEXPERIENCE: The bridge’s planners, construction workers, and project managers are experts in their work, but most stakeholders will be novices when it comes to large infrastructure projects. And even the recognized experts may lack expertise in other factors that will influence the project’s success, such as commuting patterns in this area, or the past history of surrounding neighborhoods.
Project Challenges + Stakeholder Challenges = Unrealistic Expectations and Perceived Unfairness
Stakeholders don’t have the experience to have realistic expectations, particularly when expectations should be nuanced to reflect project uncertainties. Their diversity combined with the inherent lumpiness of project impacts generates an unfair distribution of project benefits and costs. Further, since the project is large and irreversible, “redos” are impossible. This only heightens anxieties and concerns. Thus, it’s often difficult to garner the long term support needed to develop and implement a large infrastructure project successfully.
So, What Does Public Engagement Do?
Public engagement bolsters long term stakeholder support for large infrastructure projects by reducing unrealistic expectations and perceptions of unfairness.
Taking a step back reveals new possibilities.
The value of this definition is in what it reveals when we explore the process of public engagement, metrics that can guide public engagement decisions, and possibilities for scaling work to reach many more participants.
The series continues this Thursday.
Posted: July 7th, 2014 | Author: Chris Berendes | Filed under: Civic engagement | No Comments »
What happens after a particular public engagement event ends?
Experienced practitioners spend much time understanding the context of deliberation — interests, demographics, language sensitivities, and objective framing of the issues — and they bring this awareness and much else "into the room" to ensure that the "collective voice" reasonably reflects every participant.
But how well and how often is that collective voice heard outside the room, as the wider process continues? Public engagement is, generally, one milestone, and not the final one, in a longer process that may include a council vote, a commission hearing, an executive signature or veto, a referendum, or political maneuvering.
What would it take to ensure that the collective voice of "the room," developed so carefully, continues to be heard?
Lessons from the House
The US Capitol, just up the street from where I live and work, provides us with some suggestions. Consider a legislative measure that originates in the US House of Representatives. Savvy congresswomen and -men will, in shaping the legislation, consider what it will take for the bill to be passed by the US Senate and then signed by the President.
The House may take a more extreme position in order to gain bargaining leverage. It may package together seemingly unrelated measures in a single bill to force Senate and Presidential approval, e.g., by adding "widows and orphans funding" to a controversial measure, to raise the cost to a Senator who might otherwise vote against the measure or to a President considering a veto.
In addition to negotiation tactics, the House has structural means to ensure that its collective voice is heard: if the Senate modifies a bill, the House must pass those modifications before the final bill goes to the President. Further, the House — and, similarly, the Senate — has influence even after the bill has passed into law. It can affect implementation by adding or withholding funding and by holding hearings and of course, ultimately, by passing new legislation.
Three suggestions for public engagement practitioners
There are at least three ways in which an awareness of "and then what?" can inform public engagement.
Practitioners ought to explore the realities of the wider context: What comes after the public engagement process and how does that affect the likelihood that the “collective voice of the room” will be heard? And the results of these explorations should be shared with participants.
Practitioners and participants should think more strategically about how the collective voice is expressed. Just as the House may shape a bill not just to reflect its collective voice but also to strengthen the hand of its negotiators in their discussions with the Senate and with the President, we should think more carefully about who might speak and act against the collective voice “outside” the room and how that might be countered.
Transform the broader context.
What if practitioners included a follow-up survey to be taken of all participants, one year after public engagement has ended, to assess participants’ opinions of whether their collective voice was heeded as the larger process progressed? What if a public engagement process included a review of past processes and what happened with their recommendations, rather than starting in a kind of vacuum? What other measures could we recommend to government officials and to the public to “strengthen the hand” of public engagement?
Of course, there are already wise public engagement practitioners who show how some of these suggestions can be addressed in practice.
IAP2’s spectrum of participation (Inform -> Consult -> Involve -> Collaborate -> Empower) allows practitioners, government, and the public to locate a particular process in a wider context and be clear about how much or how little impact participants should expect to have.
NCDD’s engagement streams framework similarly distinguishes between Exploration, Conflict Transformation, Decision Making, and Collaborative Action as the primary purpose of a public engagement process.
AmericaSpeaks's 21st Century Town Meeting planning always included careful thinking about "linking to decision makers" — one strategy to give the collective voice more impact. And the scope and spectacle of the larger town meetings was in part intended to transform the context by giving the event and its results more political impact.
(I acknowledge that the public currently disapproves of the House and, indeed the Senate and the President. But the Constitutional structures and legislative strategies touched on above have been in place for more than a century and long pre-date current dissatisfactions.)
Posted: November 15th, 2013 | Author: Chris Berendes | Filed under: Business development, Civic engagement, The business of public engagement | 1 Comment »
Listen, carefully, so that the challenge may tell you how to meet it.
Politicos object to public engagement
Last week, in an email thread [parts 1, 2, 3], a group of Dialogue and Deliberation practitioners and friends explored some of the barriers they have encountered in attempting to provide public engagement services to government agencies. Among them:
- Officials resist giving up power, in the case at hand, and more generally – if, say, the public discovered that much could be accomplished without the mediation of politicos.
- Officials fear that citizens lack the expertise officials take to be the key if not the only ingredient in sound policy making.
As one person wrote, "Officials, directly or indirectly, must still respond to the whole body of voters at election time. If they base their decisions on a small, arguably unrepresentative set of 'advisers,' that may not set well with voters."
The discussion concluded with a thumping affirmation of the power and effectiveness of public engagement, a lament — "We get little chance to do what we do magnificently to any substantive degree," thus "On the average everything in public engagement is getting worse and worse" — and a prophetic call for action: "What can we do about these things?"
I agree – don't blame the practitioners, the people who put expertise, sensitivity, and experience into the design and implementation of processes that can work so well for participants.
Fine, but these obstacles remain. What to do?
Why? Let’s find out
Insist that the people responsible for business development — for building the bridges between practitioners and government officials — invest similar expertise, sensitivity, and experience into understanding and then addressing the concerns of those officials.
They can begin by finding the legitimate core of each of the issues raised above.
- POWER – A WAY TO GET THINGS DONE
- The public holds an elected leader accountable for a wide range of circumstances, many of which are outside of her control, e.g. winter storms, the economic policies of her predecessor, the actions of other jurisdictions. Reducing her power subjects her and, in turn, her constituents to these forces.
Worse, power is a fleeting thing, an odd amalgam of law, custom, and perception. Ask any President moving toward the middle of his second term about the challenges of staving off the powerlessness of “lame duck” status.
- EXPERTISE – ACCOMMODATING THE BROADER CONSTRAINTS
- The will of the people must often be reconciled with an array of constraints. Good policy on, for instance, healthcare and health insurance should be consistent with medical science, the psychology of incentives, economics, and demographics, among other disciplines.
Yet it’s unlikely that the public is versed in these matters. Indeed, people often have a weak grasp of key facts on even hotly debated issues. Recently, a late night entertainer showed how the man (and woman) in the street dismisses “Obamacare” as too invasive and expensive, then, seconds later, embraces “the Affordable Care Act” (the very same legislation under its original name) as a more reasonable alternative.
- ELECTIONS – MAINTAINING LEGITIMACY
- This concern may be illusory. A little digging suggests that voters often look favorably on public engagement. Chicago Alderman Joe Moore is in a stronger position, politically, for his enthusiastic sponsorship of participatory budgeting. Jay Williams, a planner deeply involved in Ohio’s *Youngstown 2010* effort, used his success there as a platform for a successful mayoral campaign.
So, in the narrow, this may be a misperception — either of the concerns of elected officials or a misperception of risks by elected officials.
However, ensuring that public engagement work is legitimate and is perceived to be legitimate is a plausible concern. The number of people involved in a regional public engagement effort is generally smaller, often by one or more orders of magnitude, than the number of voters in the region.
Apply the remedies we already know
Recasting these objections points to some remedies already at hand. As the prophet mentioned above wrote: "We know how to do public engagement better than anyone ever has known how to do it." For instance:
- POWER – PUBLIC ENGAGEMENT GETS THINGS DONE
- In the context of effective public engagement, political officials often gain significant "implementation power" by ceding a little decision-making power. There's a little more discussion at the front-end of a process, but then the bridge gets built, the teachers get hired, or the social injustice is corrected.
- EXPERTISE – PUBLIC ENGAGEMENT ALIGNS CIVIC VALUES WITH RELEVANT CONSTRAINTS
- As the IAP2 spectrum (PDF) shows, it’s possible to design a public engagement process so that it conforms to external constraints. E.g. we don’t brainstorm to determine whether a bridge is sound from an engineering perspective.
More powerfully, public engagement processes can bring acknowledged constraints into a process by educating participants about them. Issue guides can be designed to bring participants up to speed quickly on matters of law and fact, and interactive games can be designed to take those constraints into account.
Most powerfully, an elected official can in effect turn the problem, constraints and all, over to a public engagement process. E.g., the city manager of Redwood City, CA turned over a water conservation mandate to a panel of residents who strongly opposed the city’s initial solution, telling them that if they could meet the state’s mandates in another way, and come in within budget, Redwood City would act on their alternative. The panel came up with a better plan, which was implemented harmoniously.
- ELECTIONS – PUBLIC ENGAGEMENT AS A COMPONENT IN LEGITIMATE DECISION-MAKING
- In those cases where a politician correctly perceives a conflict between re-election and public engagement, there are concrete ways to proceed.
Comprehensive outreach at the beginning of a public engagement process strengthens its legitimacy, by bringing traditionally underrepresented groups to the table, and more closely mirroring the composition of the body politic.
Incorporating traditional electoral measures at the end of a public engagement process creates even broader legitimacy. E.g. the residents of Owensboro, KY began a regional planning process with an AmericaSpeaks 21st Century Town Meeting to set design goals for a revitalized downtown, and then city and county legislative bodies endorsed the decision by voting for tax increases to fund the plan.
And keep listening
Even as they participate in public engagement processes on a particular issue, public officials are following the will of the people on a much broader set of issues, threading multiple needles dictated by science, law, and history, and, often, worrying about the next election.
Business developers, working in support of public engagement practitioners, must work with public officials to understand this wider context, then translate the resulting requirements so that practitioners can apply their vast skills and experience. Those practitioners have already proved their ability to bridge a vast set of gaps to bring people as constituents into public engagement processes.
And public officials are people, too.
Posted: April 26th, 2012 | Author: Chris Berendes | Filed under: Civic engagement, Open Government, Technology, Transparency | 1 Comment »
Is that all there is?
You've landed on a Whitehouse.gov petition on an issue that's close to your heart, and you're thrilled that it's finally getting some visibility. Of course, you sign the petition, but then you wonder: Who are these people? What else can I do? How can I get plugged in? In way too many cases, the people who started the petition leave you to your wits, and of course to the Google. It's like a 3am infomercial without the 800-number. What were they thinking?
My inspiration and goad to build tools for the White House “We the People” petition site was the story of an activist who had, in effect, lost her work gathering signatures when she failed to reach the necessary threshold after a month of work.
Along the way, I wondered how often petition initiators added links to their petition text, to provide more information to potential signers or to supplement their work on whitehouse.gov by building a community on a site that gave them more control.
“We The People”-scope
To investigate, I've built a live, interactive database of petitions currently visible and open for signatures at the White House. How well are petitioners using WhiteHouse.gov traffic and visibility to build activist communities? The results aren't pretty.
90% of the time, you’re on your own
Of the 39 petitions open for signatures this morning, only four include links:
- a request for funding of an MIT anti-viral drug links to a press release providing further information
- a call for legislation implementing various economic and legal reforms (NESARA) links to an activist website and to a religious/New Age Ning community
- a call for increased funding for NASA includes a reference to a website for that issue campaign, and
- a request that the Administration veto any legislation that extends tax cuts for the highest earners includes a reference to MoveOn.Org.
None of the links (actually in plain text, since the petition site doesn't allow hot links) make it easy to plug into a community. The NASA funding campaign website is focused and includes further calls to action, but does not provide a community forum or a mailing list sign-up. The NESARA-related websites provide a wealth of information and, via Ning, a community. However, I could not see how I might easily connect with other supporters of the linked petition. MoveOn.org is a major activist community, but nothing on its home page references the current tax-related petition.
So, of 39 petitions, only three provide links that would allow a signer to tap into a larger community, discuss the petition, and monitor progress, and even those three links are muddy.
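The audit behind these counts can be sketched as a simple scan of petition texts for URL-shaped strings, since the site renders links as plain text. The pattern and the sample texts below are illustrative, not the actual petition data:

```python
import re

# Sketch of the link audit: scan scraped petition texts for URL-shaped
# strings (the petition site shows URLs as plain text, not hot links).
# The pattern and the sample texts are illustrative, not the real data.

URL_PATTERN = re.compile(
    r"(?:https?://|www\.)\S+"              # explicit scheme or www prefix
    r"|\b\S+\.(?:org|com|gov|net|edu)\b",  # bare domain with a common TLD
    re.IGNORECASE,
)

def petitions_with_links(texts):
    """Return the subset of petition texts containing a URL-shaped string."""
    return [t for t in texts if URL_PATTERN.search(t)]

petitions = [
    "Fund the anti-viral research described at web.mit.edu/newsoffice.",
    "We demand action on this issue now.",
    "Join the campaign at www.penny4nasa.org to support NASA funding.",
    "Veto any extension of tax cuts for the highest earners.",
]

linked = petitions_with_links(petitions)
print(f"{len(linked)} of {len(petitions)} petitions include a link")
```

A real audit would also classify each link by whether it leads to a community (forum, mailing list sign-up) or only to static information, which is the distinction drawn above.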
What if, instead, a petition linked to a well-designed landing page that encouraged people to sign up to track the progress of the petition, support the cause via other actions, and connect with fellow activists? As things stand, it's a missed opportunity.
And there’s more!
(The petition overview can be filtered and sorted in many different ways. For instance, you can highlight the backlog of petitions that have met their signature goal but don’t yet have an official Administration response, or focus just on the petitions for civil liberties, human rights, or immigration issues – almost half of the total currently open.)
Posted: March 1st, 2012 | Author: Chris Berendes | Filed under: Civic engagement, Open Government, Technology, Transparency | 4 Comments »
Last week, I used a trial run of a new “petition scraping” plugin I’ve developed to see which states most strongly supported the recent White House petition that requested the Administration to rescind the health care reform contraception mandate for Catholic employers.
Today, I can add a second, opposing petition to the analysis. It urged the Administration to “stand strong” on the no cost birth control requirement. From what I’ve seen on the petition site, this is unusual – some petitions garner few signatures, but very few petitions are arranged in pro/con pairs. We can take advantage of this “natural experiment” to compare state responses on either side of the issue. (Signature data for the “Stand strong” petition can be downloaded at the csv link below.)
Nebraska, North Dakota, Kansas Against; DC Engaged
The outliers, highlighted in red in the chart, are the story.
Kansas, North Dakota, and Nebraska – in the bottom right – showed significantly stronger support for the Rescind petition, at 309, 367, and 487 signatures per million, than the national average of 117. In contrast, their support for Stand Strong was fairly close to the national average of 89 per million – they’re well within the “cluster” on the left axis.
Something even more interesting is going on in DC. At 509 signatures per million, it is the standout supporter for Stand Strong. But notice that, at 229 signatures per million, it looks a lot like Kansas, North Dakota, and Nebraska in per capita support for Rescind. (I'd guess that DC's intensity reflects the pro-contraception response by long-term residents combined with the response from advocacy groups on both sides.)
This table provides the details for the chart above:
Download as csv file (4k).
The animated map shows how signatures flowed from each state, normalized by its population, with the petition “closing” on February 10. Click on the slider to see how each state contributed signatures starting on February 3.
In many respects, signers responded similarly to both petitions:
- Days to reach 500 signatures (visibility threshold)
- Days to reach 50 states and DC
- Percent of signers not providing a place
Download the “Stand strong” signature details as a csv file (660k).
See the previous post in the series for details on the Rescind petition and more information on the mechanics of petitions and signatures at whitehouse.gov.
A note on statistics
Some of the variation in a particular state's response relative to the US average will be due to chance, rather than a fundamental difference between that state's political leanings and the country's. For instance, weather patterns or preoccupation with a sports event might have reduced Mississippi's engagement; it might generate more signatures per capita on similar petitions at another time.
Statisticians measure how far an indicator departs from the average in standard deviations. The standard deviation captures the variability of a set of numbers. In the case of the petition signatures, if a particular state's response differed from the US average by less than two standard deviations – as Mississippi's did – that difference could occur by chance more than 5% of the time.
The bottom left quadrant contains all the states within two standard deviations of the US average.
The remaining three states and DC are outliers indeed. On the Rescind petition, Kansas’s response is more than two, North Dakota’s almost three, and Nebraska’s more than four standard deviations above the US average. This would occur by chance less than 5%, 0.3%, and 0.007% of the time — i.e., from rarely to never. DC’s response to Stand Strong is 5.7 standard deviations away from the mean, which would occur 0.00001% of the time by chance.
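The percentages quoted above follow directly from the normal distribution's tail probabilities. Here's a minimal check, using the rounded z-scores from the text (the real analysis would use the exact values):

```python
import math

def two_sided_p(z):
    """Probability that a standard normal variable falls more than
    z standard deviations from the mean, in either direction."""
    return math.erfc(z / math.sqrt(2))

# Rounded z-scores quoted above: the Rescind petition outliers,
# plus DC's response to Stand Strong.
for label, z in [("Kansas", 2.0), ("North Dakota", 3.0),
                 ("Nebraska", 4.0), ("DC (Stand Strong)", 5.7)]:
    print(f"{label}: {two_sided_p(z):.6%}")
```

For two, three, and four standard deviations this yields roughly 4.6%, 0.27%, and 0.006% – in line with the figures above.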
The outliers, in other words, are radically more engaged in these respective petitions than the rest of the country.
Posted: February 23rd, 2012 | Author: Chris Berendes | Filed under: Civic engagement, Open Government, Technology, Transparency | 2 Comments »
Open government often carries a significant risk to activists: organizing efforts may be “locked up” by Federal agencies, even with the best intentions.
The rewards, and risks, of squirreling away
Consider Terra Ziporyn Snider. Last fall, she initiated a petition on the White House website requesting changes in school start times. Then current White House rules required that she gather 5000 signatures within 30 days in order to keep the petition on the site. She and her supporters encountered various technical problems – intermittent site outages, difficulty in signing up new users — and as the deadline approached, with 1575 signatures recorded, Dr. Snider realized that the fruit of her efforts was about to be digitally vanished, per White House rules.
I was taken by the lengths to which she went to transplant the list of names she had built at the White House to MoveOn. Though these signatures had been gathered through her online organizing efforts, and of course were stored on a hard drive somewhere in the whitehouse.gov domain, she had no way to access the data easily. Instead, she printed out the petition and, apparently, retyped the signatures by hand.
Imagine a New England squirrel storing away nuts for next winter in the trunk of a tourist’s car – it’s secure today, tomorrow, and perhaps even next week, but when the snow comes, the car — and the nuts — are in Florida, and the squirrel starves.
Of course, in the case of White House petitions, the potential of reaching a wider audience for your cause and of getting the attention of the Obama administration may make this risk worthwhile.
But, as Dr. Snider found, sometimes the car drives away, and you've got to scramble not to be left empty-handed. How could one reduce the risk, particularly for activists who don't know about or don't have access to sites like MoveOn?
A tool for liberating signatures
Her story was my inspiration to build a page scraping toolset that would allow activists to get the benefits of White House web petitions, while reducing the risk.
As a proof of concept, I set the plugin to work on two recent petitions. The first requests that the Administration rescind regulations mandating that religious institutions provide contraceptive coverage under their employee health insurance plans, even if this is contrary to their religious precepts. The second urges the Administration to “stand strong” in maintaining this mandate.
In this post, I'll focus on the signature data from the "Rescind" petition. A follow-up post will provide details on the "Stand strong" petition signatures. [Update 3/2/2012: I've posted the "Stand strong" results and a state-by-state comparison.]
If you’re familiar with the mechanics of the White House petition site, skip to the next section for the results.
The mechanics of initiating and signing petitions
Anyone may initiate a petition. Per the rules currently in force, new petitions are visible only to those web visitors who already know the specific petition URL. Once a petition receives its 150th signature, it is listed by the White House in the index of current petitions and can be found by appropriate search terms.
The initiator and all other petition supporters then have 30 days to gather at least 25,000 signatures in total. If they fail to reach that threshold within 30 days, the petition disappears, as Dr. Snider experienced. If 25,000 people "sign" the petition, the White House promises to post a response.
A WhiteHouse.gov login is required to sign a petition. To get a login, you provide your email address, your name, and, optionally, city, state and zipcode. Each petition signature block includes the first name and last initial, the signature’s order in the overall total, the date the signature was provided, and the city and state of the signer, if provided. For instance:
February 22, 2012
Signature # 1,079
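Fields like these can be pulled out of the page text with a couple of regular expressions. This is only a sketch of the kind of extraction a scraping toolset performs, not the plugin's actual code:

```python
import re
from datetime import datetime

def parse_signature_block(text):
    """Extract the signing date and sequence number from a signature block."""
    # Date in "Month DD, YYYY" form, as shown on the petition page.
    date_match = re.search(r"[A-Z][a-z]+ \d{1,2}, \d{4}", text)
    date = datetime.strptime(date_match.group(0), "%B %d, %Y").date()
    # Sequence number, with thousands separators stripped.
    num_match = re.search(r"Signature #\s*([\d,]+)", text)
    number = int(num_match.group(1).replace(",", ""))
    return date, number

block = "February 22, 2012\nSignature # 1,079"
when, seq = parse_signature_block(block)
# when -> 2012-02-22, seq -> 1079
```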
Example: Signature data from the “Rescind” petition
From January 28 through February 10, the Rescind petition gathered 29,127 signatures. The inset below displays the raw data gathered by the page scraping toolset, which can also be downloaded by the link at the bottom of the table.
This animated map shows how the signatures flowed in from January 28 to February 10, normalized by the population of each state.
The shading of each state shows the number of signatures provided by that state, to date, for each million residents, based on 2010 Census results. States that "punched above their weight" are shown in darker green; states that were underrepresented, in very light green. Since the figures are cumulative, each state grows somewhat darker over time. Click on the slider at the bottom of the map to show how the signatures accumulated over time.
Signature data for the Rescind petition
Download as csv file (1.1 mb).
More detailed analysis shows
- that the petition took off pretty quickly – two signatures on day one, 32 on day two, 1,386 on day three (reaching all fifty states and DC)
- that state participation varied dramatically: Nebraska provided more than 480 signatures per million residents, North Dakota more than 360, while Mississippi provided just 31 signatures per million inhabitants. (Overall, across the US, 127 signatures were provided per million residents, if we assume that all signatures came from the United States.)
- 4,951 signatures were provided by people who did not state their location — which shows up as NULL in the data.
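The per-million rates used throughout these posts are a simple normalization against 2010 Census population. A sketch, with made-up numbers (the real counts are in the csv downloads):

```python
def per_million(signatures, population):
    """Signatures per million residents -- the normalization
    used for the maps and charts in these posts."""
    return signatures * 1_000_000 / population

# Hypothetical state: 750 signatures from 2.5 million residents.
rate = per_million(750, 2_500_000)  # -> 300.0 signatures per million
```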
Next post: How does the signature flow for the “stand strong” petition compare? What surprises do we find when we compare states’ activity on these two opposing petitions? Are these signers as reluctant to provide place information as the Rescind signers?
A postscript for data geeks
- Each signature is numbered. The same sequence number may appear multiple times on the signatures page, indicating, presumably, two or more different people who signed almost simultaneously. For instance, if the third, fourth, and fifth signers all acted simultaneously, the first six signatures would be numbered 1, 2, 3, 3, 3, 6.
- One Rescind signer was particularly eager: he signed twice in quick succession, so his name and place show up twice in a row with the same sequence number. This anomaly can’t be tracked by the software currently, leading to a database count of 29,126 signatures, one less than shown on the White House site.
- In addition to the “no location” signatures, 28 people provided place information for military post offices or mistyped their place information.
- The scraping tool captures the first name and initial of the signer as shown on the petition, but I’ve omitted this column from the download for the moment.
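Given the sequence-number quirks above, a counting pass has to decide what to merge. Here's a sketch (the field layout is my own, not the toolset's actual schema): exact repeats of the same (sequence, name, place) triple are treated as the double-signer anomaly and counted once, while distinct names sharing a sequence number are simultaneous signers and all count.

```python
def count_signatures(rows):
    """Count signatures, collapsing exact repeats of the same
    (sequence, name, place) triple while keeping distinct
    simultaneous signers who share a sequence number."""
    seen = set()
    total = 0
    for row in rows:
        if row in seen:
            continue  # same signer repeated back-to-back: count once
        seen.add(row)
        total += 1
    return total

rows = [
    (1, "Ann B.", "Omaha, NE"),
    (2, "Carl D.", "Wichita, KS"),
    (3, "Eve F.", "Fargo, ND"),    # three people signed at once,
    (3, "Gus H.", "Lincoln, NE"),  # so all three share number 3
    (3, "Ida J.", "Topeka, KS"),
    (6, "Ken L.", "Dover, DE"),
    (6, "Ken L.", "Dover, DE"),    # eager double-signer: merged
]
assert count_signatures(rows) == 6
```

Note the trade-off: this approach would also merge two genuinely different signers whose abbreviated names and places happen to coincide, which is essentially the limitation described above.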
Posted: September 18th, 2010 | Author: Chris Berendes | Filed under: Civic engagement, crowdsourcing, Open Government | 1 Comment »
When I venture into new arenas – social media, crowdsourcing, online engagement – I’m fed by discussions with people who are doing this work. But, too often, the conversation gets bogged down by arguments about definitions.
For instance, in a chat about crowdsourcing recently, someone offered online commenting on government regulations as an example, then someone else – it might have been me – almost derailed the conversation by asking whether commenting really counted as “crowdsourcing”.
The problem: the person who’s just run a rule-making process that received thousands of comments knows what she did, how participants responded, what seemed to work, what fell short. But she – and all the rest of us – are clueless about whether this really is crowdsourcing.
In general: when we’re describing an experience or process we’ve experienced, we know what we’re talking about. But when we debate whether this experience is an example of crowdsourcing, we don’t.
Bold claim on my part, I know. And, one day, we’ll be able to agree, quickly, on whether my friend’s process was crowdsourcing, or collective intelligence, or prediction markets, crowd-storming, or peer production or something else entirely.
Why? We’ll, collectively, have more experience, and we’ll have come to (some) agreement on who the authorities are, and there’ll be some benefit to the definitions.
But today we’re still groping, learning what’s been done, identifying new combinations that haven’t yet been tried but look promising.
In effect, we’re crowdsourcing the definition of “crowdsourcing”
(And if you're thinking that this advice applies to discussion about "social media", "Gov2.0", and "online engagement" as well, you've got my point exactly.)
Posted: May 3rd, 2010 | Author: Chris Berendes | Filed under: Civic engagement, Metrics | 3 Comments »
Start with “how do I know it’s working?”, not “what can I count?”
E-democracy.org has 16 years of experience in creating and hosting online civic forums via email and the web.
I participated in an email thread there recently that began with this question:
“How would you measure engagement on public issues via interaction in online spaces?”
It led to a lively exchange, but it left me unsatisfied. “Measure” and “metrics” create a kind of tunnel vision, focusing attention on what’s easy to count on the web (hits, number of posts per day, number of posters, pageviews, unique visitors), and away from our understanding and experience of online forums.
As it happens, Steve Clift, the founder of e-democracy, recently reported the results of a grant to create four online forums centered on a number of towns in rural Minnesota.
The report discussed a number of ways that forum discussions had affected their communities:
- A discussion about new regulations regarding the handling of household wastewater led the county’s director of planning to reconsider regulatory language.
- Discussions in a second forum generated stories in the local newspaper.
- Participants used a third forum to get advice on how to fight the city’s withdrawal of their permit to raise chickens in their backyard.
- Participants in another e-democracy forum, not covered in this report, used it to organize their response, including meetings with city officials, to a mugging near a transit stop. The transit agency's community outreach staffer joined the forum, and then the discussion, reporting on the agency's actions. The president of the local neighbors group participated as well.
- The report also noted that local government websites had linked to some of the forums and, in one case, the local government had sponsored the start-up of the forum.
These stories suggest a variety of measures that could be applied to e-democracy forums:
- How many local government officials are forum members? What percentage of all local government officials are members?
- How many of these post, and how often do they post? How many of these posts reflect concrete changes in behavior (meetings scheduled, agenda items added or changed for official meetings, changes to legislation or regulation)?
- How many discussions have been used to organize meetings in the community or with government officials?
- How many discussions have received links from local or regional newspaper websites?
These measures all need development, and we could likely find booby traps in each. But consider the conclusion of two (hypothetical) reports on community impact of a (hypothetical) forum in Smallville:
Based on web metrics
The forum received 500 pageviews from 200 unique visitors per month.
It had a membership of 128 at the end of the year.
The average length of a visit was 3.5 minutes.
The average visit includes 4 pageviews.
Based on “grow your own” measures drawn from these stories
Two of the five members of the city council became members of the Issues Forum. One joined after a discussion of the city’s response to last winter’s monstrous snowfalls erupted in the forum and led to a delegation of forum members testifying before the council about ineffective snow plowing.
One councilperson posts at least once a week. Three times over the course of the past year, she has responded to questions in the forum or asked for further information. She also introduced an amendment to a local zoning ordinance based on concerns raised in the forum.
A dozen forum members from Smallville South used the forum to organize a meeting with the transportation department to discuss the pothole problem on local streets. They are using the forum to follow up on the meeting, and an official from the transportation department posts updates on actions taken relating to potholes at least once a month.
The Smallville Gazette has used the forum to solicit feedback on its coverage of the city council.
On average, one out of four of its weekly online issues includes a link to one or more Forum discussions.
Which would be more likely to persuade you that Smallville’s online forum had actively engaged the community?