Wednesday, October 16, 2013

Who Counts? Grappling with Attendance as a Proxy for Impact

When you count attendance at your museum, do you include:
  • people who eat in the cafe?
  • people who rent the facility for private events?
  • people who engage with your content online?
  • participants in offsite outreach programs? 
  • volunteers? 
This summer, the St. Louis Post-Dispatch published the kind of "how sausage is made" story that rarely gets written about the arts. It's about museum attendance and how the five big, free museums in St. Louis count it. There's quite a range. Summertime concerts at the history museum? Those count. Outdoor movies at the art museum? Nope. At the St. Louis Science Center, the focus of the article, there was a particularly creative perspective on attendance, including numbers for offsite board meetings, parades where staff made a showing, and attendance at a school next door. The only form of engagement lacking in the article is online participation--which, for many museums, could yield the highest numbers of all.

Even if you consider some of these counting strategies to be egregious, the basic question is still relevant: who counts? When I reflected on our museum, I realized we have some inconsistencies in how we calculate attendance. For us, annual attendance includes programmatic activities onsite and off (about 10% of our programming is conducted at community sites). That means daytime visitors, event participants, school tours, and outreach program participants. It does not include facility rentals, meetings, fundraising events, or people who might see us at a community event but not directly engage.

What's missing from this picture? I think you could reasonably argue that we should be counting:
  • researchers who come in to access information in the archives
  • people who rent the museum for a private event that includes a curator/artist tour of exhibitions
  • kids in museum summer camps
  • people who visit the historic cemetery that we manage
  • people who talk with us online about historic photos we share or blog posts about the collection
And then there are the weird inconsistencies. Why do we count participants in an art activity for families at a community center but not members of the Rotary Club to whom I give a presentation about the museum? Why do we count visitors who tour the galleries chatting with their friends but not visitors who tour the galleries chatting with a staff member (i.e., as part of a meeting)?

This doesn't even get to the potential parsing of people's intentions. If someone comes to an exhibition opening for the free food, do they count? If a kid gets dragged to a museum with their parents, do they count? If someone has an epiphany about art outside the museum, do they count?

Probe too deeply and the question gets absurd. The more important question is not WHO counts but WHAT counts. Internal to an individual museum, relative attendance--changes over time or across programs--can yield useful information. But if you try to make meaning out of attendance comparisons across institutions, you start comparing apples and oranges. While many institutions break attendance out by program area, I don't know of any that separate attendance into "impressions," "light engagement," "deep engagement," etc.--categories that might actually have meaning.

What is meaningful in the context of achieving our mission? That's the number we should be capturing.

The Relationship Between Attendance and Impact

How can we measure impact? That's a huge question. Let's look at it in the narrow context of the relationship between attendance and impact.

What is the information value of attendance? Attendance does a good job of representing how popular an institution is, how heavily it is used, and how those two things vary over different times of day, days of the week, times of year, and types of programs.

But does attendance demonstrate mission fulfillment? Unless your mission is "to engage X number of people," probably not. For some institutions, like the MCA Denver, attendance is seen as a very poor measure of impact. But for almost all museums (even MCA Denver), attendance is correlated with impact in some way. 

For attendance to serve as a meaningful proxy for impact, you have to articulate a theory of change that connects attendance to your mission (inspiration, learning, civic participation, etc.). And then, you have to be able to calculate a conversion factor that relates the number of people who attend to the number for whom the mission is fulfilled.

Imagine managing a shoe store. Your mission is to sell shoes. Attendance is the total number of people who walk in the door. Of those people, 10% actually buy shoes. That 10% is your conversion factor: if you want to sell five pairs of shoes, you need fifty people to walk through the door.

Now let's say your mission isn't just to sell shoes, but to build relationships with customers who will love your shoes and buy more of them in the future. Maybe the conversion factor from first sale to repeated sales is 20%. Now you have fifty people who walk through the door, five who buy shoes, and one who will be a longtime customer.
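
To make the arithmetic explicit, here's a minimal sketch of that two-stage funnel (the counts and rates are the hypothetical ones from the shoe store example, nothing more):

```python
# Hypothetical two-stage conversion funnel from the shoe store example.
visitors = 50        # people who walk through the door (attendance)
buy_rate = 0.10      # conversion factor: share of visitors who buy shoes
repeat_rate = 0.20   # conversion factor: share of buyers who become longtime customers

buyers = visitors * buy_rate      # 50 * 0.10 = 5 people who buy shoes
longtime = buyers * repeat_rate   # 5 * 0.20 = 1 longtime customer

print(f"{buyers:.0f} buyers, {longtime:.0f} longtime customer(s)")
```

The point of writing it out this way is that each stage of the mission gets its own rate, so you can see exactly where the funnel narrows.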

Now let's turn back to museums. The St. Louis Science Center's mission is to "ignite and sustain lifelong science and technology learning." What's the conversion factor from a single visit to that mission? 

I'd start by splitting the "igniting" from the "sustaining." You could argue that any single visit or interaction with the Science Center--at the facility, out in the community, online--could have the spark of ignition. But sustaining lifelong learning requires a different level of commitment. That count could include people who are visitors/members for 10+ years. Or volunteers who participate on a weekly basis. Or students who visit at some point and go on to careers in science and technology. 

It's not easy, but the museum could define the indicators it considers representative of sustained learning. It could count those instances. With some effort, you could calculate conversion factors from igniting to sustaining for each major program area. And if you knew the igniting-to-sustaining conversion factor for general attendance, you could actually generate an estimate of how many of the kids zooming around the facility are likely to sustain a lifelong interest in science.
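As a sketch of what that estimate might look like in practice (every program name, attendance figure, and conversion rate below is invented purely for illustration):

```python
# Hypothetical estimate of sustained learners per program area.
# All attendance figures and conversion rates here are made up.
attendance = {
    "general admission": 800_000,
    "school tours": 120_000,
    "summer camps": 2_500,
}

# Assumed igniting-to-sustaining conversion factors: the share of
# attendees in each program who later show an indicator of sustained
# learning (10+ year membership, weekly volunteering, a science or
# technology career, etc.).
ignite_to_sustain = {
    "general admission": 0.0005,
    "school tours": 0.002,
    "summer camps": 0.05,
}

for program, count in attendance.items():
    sustained = count * ignite_to_sustain[program]
    print(f"{program}: ~{sustained:,.0f} likely sustained learners")
```

The hard part, of course, isn't the multiplication; it's defining the indicators and measuring the rates honestly for each program.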

Looking at it in this way would also allow institutions to expand beyond reductive "all about attendance" approaches to demonstrating impact. You could argue that some of the most important work of "igniting and sustaining lifelong science and technology learning" has nothing to do with attendance at the science center. It might involve producing ad campaigns linking science to community issues, or advocating for job training programs in technology, or designing curriculum for community colleges. And again, if you could designate indicators for the kinds of learning impacts possible through these efforts and the conversion factors from igniting to sustaining, you could count and present them.

So perhaps the St. Louis Science Center's annual report could look like this:
"Our mission is to ignite and sustain lifelong science and technology learning. We know that not every spark leads to a blaze, so we focus on igniting as many sparks as possible and making strategic investments in programs that are likely to sustain learning for the long term. 
We ignited science and technology learning this year through ongoing exhibits, educational programs, outreach in the community, and online interactions, which reached 3 million people. These sparks grew into sustained lifelong learning for at least 400 people, who got involved in local technology hobbyist projects, pursued careers in science and technology, or helped us facilitate learning experiences as volunteers at the museum. 
We also focused this year on working with the countywide adult education agency to start an intergenerational science program at three senior centers throughout St. Louis. While this program involves only 40 people per site, all of them are participating in the kind of deep science engagement that is proven to lead to lifelong science and technology learning."
Too unwieldy or unorthodox for funders? Maybe in the beginning. But in an age of nonprofit accountability and increasingly sophisticated evaluation strategies, I think this kind of approach could be useful. What do you think? 