With IMLS funding, the Measurement of Museum Social Impact study has spent months working with museums across the US to validate a social impact instrument. Watch this recorded session from the 2023 AAM Annual Meeting to hear about the study’s goals and how they’ve been achieved. Then learn about results from the social impact survey and how two participating museums are using the findings.
My name is Michelle Mileham. I use she/her pronouns. I am the project manager for the Museum Social Impact Study, MOMSI. We’re going to share a little bit about that. The project is just in its final stages, and we’re really excited to be coming to a conclusion and wrapping up with all of you with some results and a great activity at the end of our time together. I am joined this morning by two of our host museums. I know the schedule said three, but unfortunately our third couldn’t join us today. I’m going to let them introduce themselves.
Hi everyone. Thanks for joining us this morning. My name is Sabre Moore and I’m the executive director of the Carter County Museum in Ekalaka, Montana.
Hi, I’m Jen Kindick. I’m the museum education specialist at the Molly Brown House Museum right here in Denver.
All right, thank you both. A little overview of what we’re going to do today. I’m going to share some project background. If you’ve joined us for a session before, a lot of it will sound familiar, but we are going to try and pare it down as much as possible just to make good use of our time. Then we’re going to share out some of our findings, because I’m really excited we have findings to share with you all from this national study. And then we’re going to spend our last 30-ish minutes working on an activity with a toolkit that is the product of this project.
The Measurement of Museum Social Impact, or MOMSI as we like to call it because it is a mouthful otherwise, started with two pilot projects in the state of Utah. And we had really great success with both of those projects, both in the data we were seeing and in how the museums who were part of those studies were using the data or planned to use the data. And the project team thought, “Well, is this something that could be done on a national level? How do we take this to even more museums outside of the state?” With funding from the Institute of Museum and Library Services, IMLS, we received a national leadership grant in the research category to scale up and test the museum social impact tool across the US. Because we were a research study, we had really specific goals and objectives.
One, of course, we were measuring museum social impact nationally. That’s just a byproduct of what we were proposing. Our host museums that participated also all got individual data. But in that process, all of the data we received from participants across the country helped us to validate the museum social impact survey. What we mean by this is that when we publish a survey and share it out with all of you and research colleagues, we want to make sure the instrument we’re publishing is both valid and reliable, so it’s consistently measuring the same thing and is measuring what we intend it to measure.
And then that survey and tool is a big part of this social impact toolkit that, like I said, is the product of this three-year-long project. We are in the process of developing that toolkit now and will publish it by June 30th. The survey is a big part of it. But the lived experiences of our host museums, the two here and the others that have joined us at other conferences, all of their experience and perspectives on running a study like this are included in that toolkit, so it’s as useful a product as possible when it rolls out.
I always like to shout out all 38 of our host museums here. We did have an open application process. I’ll show the timeline in a minute, but we had about 70 museums apply to participate. And because of staff capacity on our end running the project, as well as budget, we could accept 38 of them. About half of the museums that applied were welcomed into the study. And you may notice from these logos up here, we have museums representing every region of the US. We also have different museum content types, everything from art museums to zoos and aquariums. We have historic houses, obviously we have science centers, children’s museums, botanical gardens.
If they were represented in the applicant pool, we tried to pull in as much variety as we could. And then these are all very different sizes of museums. Of course we have really big institutions in here; they have research and evaluation staff embedded in their work and their site, and 300 to 400 staff. And we also have really small, entirely volunteer-run museums. Again, that goes back to one of our goals of scaling up: “What does this look like and how can something like this be used at any type of museum?” We really wanted to try and get as many varied perspectives as possible; those were really valuable for us to include.
This is what we asked of our host museums. We did require them to be open to the public. We didn’t count virtual experiences as part of the study, and that is not because we don’t see those as really valuable learning experiences, or don’t recognize how important they were amidst COVID, but we were, again, scaling up two pilot studies that had happened prior to COVID. Because we wanted to validate that instrument, we wanted the museum visit experience to be as similar as possible to those other studies. That’s the only reason we made that decision. Our host museums did have to recruit at least 100 participants, and we’ll talk a little bit about that later when our museums share out. That number came from a backwards design approach: we knew from statistical analyses that we wanted at least 40, ideally 60, completed surveys for each site.
We know as social science researchers, people often sign up for things and then do not participate in that thing. We knew we would have some attrition. And we thought, “Well, if we want to end with 60 and about half are going to drop out, let’s have them recruit 100.” It felt safe. And then the final thing is that all of our museums had to allow participants, the people they selected into the study, and at least one guest of each of those participants, to visit the museum up to three times free of charge. If they had an admission fee, they had to waive it essentially six times per participant. Here is our timeline, and I like to just show this because it really indicates how long we have all been in partnership in this study. It’s been a very long journey, about two years now, and about a year and a half since we selected our museums.
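The recruitment math described here, aim for 60 completed surveys per site and assume roughly half of sign-ups drop out, can be sketched as a back-of-the-envelope calculation. The function below is purely illustrative; the study itself settled on a round figure of 100 rather than computing a target this way.

```python
import math

def recruitment_target(desired_completes: int, expected_retention: float) -> int:
    """Back-of-the-envelope: how many participants to recruit so that,
    after attrition, roughly `desired_completes` finish the study."""
    if not 0 < expected_retention <= 1:
        raise ValueError("expected_retention must be in (0, 1]")
    return math.ceil(desired_completes / expected_retention)

# With the study's stated minimum of 40 completed surveys and ~50% retention:
print(recruitment_target(40, 0.5))  # 80
# With the ideal of 60 completed surveys:
print(recruitment_target(60, 0.5))  # 120
```

Note that hitting 60 completes at 50% retention strictly implies 120 recruits; the study’s round figure of 100 sits between the minimum target (40 completes) and the ideal (60).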
It’s been a really long process to get where we are. Like I said, we were recruiting museums in early 2021. There was a lot of uncertainty at that time. We were happy that we got 70 applicants and were able to select those 38. By that summer and throughout the fall and winter of 2021, all of our host museums were recruiting participants. Our project team offered language for e-newsletters and social media. We provided pictures and logos. We also managed the link to the form for participants to sign up to participate. Our host museums had to communicate it to each of their audiences and their communities, but the link that they shared all came back to our project team. That was something we managed for all of them. And then from fall of 2021 through August of 2022 was what we call our study open period.
It’s just the timeframe that visitors could come complete those three visits. Besides limiting it to three visits, we didn’t have any other parameters for participants. They could visit when they wanted during normal operating hours, stay as long as they wanted, experience what they wanted. We really wanted it to be a visitor-driven experience. And then when they either completed all three visits like we had asked, or at least one visit when the study period closed in August, we sent participants an email with a link to the social impact survey. Again, that was something our project team managed. With help from the host museums, we were tracking those three visits, but anyone who completed at least one visit received that social impact survey, and the responses came back to our team. We’ve been deep in data analysis so that we can be here today to share out all of those results.
Measuring social impact, what exactly does this look like? Like I said, we asked our host museums to recruit at least 100 participants. The median recruitment across all 38 sites was 143 participants. Right at the middle, most host museums were able to recruit more than we had asked for. You’ll see that the median accepted into the study ticks down just a little bit to 125. That is really just because we had some incomplete applications, and some households submitted multiple sign-ups, and we wanted to select only one per household. When you really look at it, we’re focused on that 125 number as the median for the study. And then you can see each of our representative host museums up here: Carter County Museum recruited and accepted 68, and Sabre can talk a little bit more about that number later.
That was something we were deep in conversation about, but again, because we wanted to test this at museums of different sizes and in different regions, that was something we knew was likely to happen and wanted to see how it played out. And then Molly Brown House Museum accepted 97, right at that threshold of what we had asked. Looking at the number of visits: like I said, if they completed at least one visit, we sent them the survey. This gives you an idea of that attrition. Like I said, we thought we were in a really good place if we asked host museums to recruit and accept 100 participants. The project median, again, was 125 participants accepted into the study. Only 32 completed at least one visit, an attrition much higher than the 40% to 50% we thought would drop out.
Again, that’s across all 38 sites. That’s something we can’t explain; we weren’t really prepared to answer why people didn’t participate. We have some anecdotal evidence of maybe why, but that was, again, just a harsh reality of the time we’re in. And again, we’re trying to figure out how this can actually work at a national level, and that was one indicator. Carter County Museum and Molly Brown House, though, were very different from that project median. At Carter County, 67 out of those 68 completed at least one visit. We were thrilled. It was a great case study for our project here. And then Molly Brown House was what we were anticipating: they had 59 out of that 97 complete at least one visit, what we would expect to see. And at the end of the project, it ended up being that we sent that survey to just over 2,500 participants across the country, and then we got just over 2,000 returned to us.
That’s an 80% completion rate, which is really good. If anyone in the audience does audience research, we were pretty happy with that 80% completion rate. The people who were invested really came through. Social impact: what exactly were we measuring? We recognize that there are a lot of ways to think about social impact and define this work. For the purpose of this study, we define social impact as “the effect of an activity on the social fabric of a community and the wellbeing of the individuals and families who live there.” And we measure this through four long-term outcomes: continued learning and engagement, health and wellbeing, strengthened relationships, and valuing diverse communities. I’m just going to give you a little preview of what this looks like on the survey. This is on our website, and it will of course be a big part of the toolkit.
Continued learning and engagement, of course that’s the long-term outcome. We don’t just ask participants that. We have these indicator statements, which are those bullet points, those are the statements that participants are responding to. Things like, “I wonder about how things work, I regularly visit local museums,” which could include zoos, gardens and so on. Increased health and wellbeing is really about the physical and mental wellbeing of the participant, being able to bounce back from adversity, being open to new ideas. Strengthened relationships is really their network of friends and family. They responded to statements like, “It’s easy for me to develop social relationships. I make it a point to spend time with friends and/or family.” And then finally, valuing diverse communities was really the diversity that they see in their community. And this was really purposefully vaguely worded. We didn’t want to define what we meant by diversity.
This is things like, “I learn new things from people who are different from me. I understand how cultures are similar and different.” There were about 50 statements like this that participants responded to, and we used a retrospective post-then-pre design. What that means is that they received the survey once and, in that same survey, responded for what they thought or felt after their visits and before their visits. We measured both of those in one survey. We’re not going to get in the weeds, but I’m happy to answer questions later about that. Along with all of those closed-ended questions, we also had six open-ended questions that participants could respond to. The top three are ones that our project team analyzed; they’ll be part of the report that we share out. We analyzed them for each of our host museums.
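As a sketch of what analyzing a retrospective post-then-pre survey might look like, the snippet below pairs each participant’s “before” and “after” ratings for an indicator statement and reports the mean change per indicator. The field names, example ratings, and rating scale are illustrative assumptions, not the study’s actual data format.

```python
from statistics import mean

# Each response pairs a retrospective "before" rating with an "after" rating
# for one indicator statement, both collected in the same single survey.
# Field names and the Likert-style ratings here are invented for illustration.
responses = [
    {"indicator": "I regularly visit local museums", "pre": 3, "post": 5},
    {"indicator": "I regularly visit local museums", "pre": 4, "post": 4},
    {"indicator": "I wonder about how things work",  "pre": 5, "post": 6},
    {"indicator": "I wonder about how things work",  "pre": 2, "post": 5},
]

def mean_change_by_indicator(rows):
    """Group paired ratings by indicator and report the mean post-minus-pre change."""
    grouped = {}
    for row in rows:
        grouped.setdefault(row["indicator"], []).append(row["post"] - row["pre"])
    return {indicator: mean(changes) for indicator, changes in grouped.items()}

for indicator, change in mean_change_by_indicator(responses).items():
    print(f"{indicator}: mean change {change:+.2f}")
```

A real analysis would pair these change scores with a significance test appropriate for ordinal paired data, which is where sample size comes into play.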
We felt like they were the most closely tied to the idea of social impact and how we defined it: “How does this museum benefit your community? How did participating in this study change your perspective of museums or cultural sites? And in what ways, if any, did the host museum change the way you interact with others?” Again, those are really tied to our construct and ways of thinking about social impact. What did we find? I’m happy to share this beautiful slide: if we look at all of those closed-ended statements, you can see each of these boxes as one of the four long-term outcomes. 10 out of 10 in continued learning and engagement, 16 out of 16 in health and wellbeing, 10 out of 10 in valuing diverse communities, and 12 out of 12 in strengthened relationships. Go us.
This is really exciting to see. Again, this is the aggregate, the 2,000 responses. I say that, and we’re going to share other data here in a second, because the more data you have, the easier it is for statistics to find this result. While this is really exciting for us as a field, it is because the sample size was there and it was easy to find the significance. We’re actually going to show you, and I want to applaud Carter County Museum and Molly Brown House Museum for sharing their individual museum findings, so that when you do this work, you have the context and can see what you might find in data with a sample size of less than 2,000.
Carter County Museum had significant impact in all the indicators as well. But as Michelle was saying, there were a few that were a little lower than the aggregate. What that means for us is just a really cool opportunity to integrate this into our strategic planning: how are we going to do better at answering health and wellbeing, valuing diverse communities, and strengthening relationships? I know that our museum actually did really well on strengthening relationships. It says 10 out of 12, but we did very well comparatively. And I think a lot of that speaks to the rural community of which we are a part. People in rural communities in particular very much have the advantage of having to be tight-knit, having to rely on their neighbors whether they like them or not, sometimes. As Michelle said earlier, 67 of our 68 completed their visits.
We also had a 96% completion rate on our survey, and that was just that accountability and the fact that I knew everybody that took the survey. Even if they weren’t from nearby, I knew who they were. I could find them and I could tell them to get on it. We were really excited about these. And one of the health and wellbeing indicators in particular that we didn’t score well on is that people in our community did not feel that they had control over their lives. Now, some of that is probably a symptom of it being during the latter part of COVID, but it’s also rural communities, and ours in particular is very agriculture-based, where you’re beholden to the weather and to what’s happening around you. Very interesting.
Here’s ours from the Molly Brown House Museum. You can see they’re quite different from the national results as well as from the Carter County Museum. We, of course, are very proud of our continued learning and engagement scale hitting eight out of 10. And what we did find change on were the indicators that really do matter to us as a museum and are part of what impact we want to have. For example, “I can see how exploration leads to learning. I challenge the way things are done.” We saw positive change in those statements for continued learning and engagement. For health and wellbeing, “I am confident in contributing my opinion to a conversation.” Valuing diverse communities, “I’m aware of the challenges faced by others with backgrounds different from my own.” And of course, one out of 12 for strengthened relationships. At first, we were like, “Okay, what happened? What’s going on?”
The one that we did find change in was, “I often engage in meaningful conversations with my friends and/or family members.” And that really gets at the heart of what we try to do in our self-guided tours and our guided tours, where we try to promote conversations, asking questions like, “What legacy do you want to leave behind, like Margaret Brown? What change do you want to see in your community?” Really thinking about that sort of strengthening relationships was very positive. The other thing I did was delve deeper into the data to see why it was only one out of 12, and basically the majority were giving us fives and sixes, saying they already strongly agreed with that outcome before they came to the museum. There really wasn’t a change after. That’s something you always have to look at. It looks maybe a little negative here, but when you actually look into the before-and-after data, there’s a bigger story about what was happening with these indicators.
Thank you both. And I will say, Sabre alluded to this: the strengthened relationships piece, what you see at the Molly Brown House Museum, was pretty similar to what most of our host sites saw in their data. That surprised us as a project team because that was a really high long-term outcome in the state pilot. We’ve been thinking about why that might be, and we don’t have the answers really, but we were coming out of COVID closures and we were feeling siloed and isolated a bit. I think there are some factors at play, but that was surprisingly low to us across the national study, even though in the aggregate it looks really good. Like I said, our project team analyzed those three open-ended questions. For anyone who has done qualitative data analysis, you know how long, and sometimes tedious, that can feel.
We were really excited to do it even though it did take us a long time. We developed codebooks for each of the questions, looked at a sample of the data, and then a team of three of us deliberated, agreed upon, and kept testing the codebook to make sure we were finding all of the emerging themes and labeling participants’ statements and responses accurately and consistently across all three of us. Then we dove into these thousands of open-ended responses. This is a table of those emergent themes from “How does this museum benefit your community?” Again, this is the aggregate data, all 38 of our host museums. And what you might be picking up on here is that what emerged from our analysis are the four long-term outcomes. We didn’t go into the data analysis expecting that, but the more we read through it and tried to chunk things together, we were like, “Oh, this is really similar to what the tool is measuring.”
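A common way to quantify whether multiple coders are applying a codebook consistently is an inter-rater agreement statistic such as Cohen’s kappa, computed pairwise between coders. The sketch below is illustrative only; the codebook labels are invented, and the study does not say which agreement statistic, if any, the team used.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: agreement between two coders, corrected for the
    agreement expected by chance given each coder's label frequencies."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    categories = set(counts_a) | set(counts_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / n**2
    if expected == 1:
        return 1.0
    return (observed - expected) / (1 - expected)

# Two coders applying an invented codebook to six responses:
coder_1 = ["learning", "learning", "relationships", "other", "learning", "wellbeing"]
coder_2 = ["learning", "wellbeing", "relationships", "other", "learning", "wellbeing"]
print(round(cohens_kappa(coder_1, coder_2), 3))  # 0.769
```

Values near 1 indicate the coders are reading the codebook the same way; disagreements flag codes whose definitions need another round of deliberation, much as the team describes.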
You can see continued learning and engagement: over a thousand people responded that the benefit of the museum is that continued learning and engagement theme. Next after that is this other category, and this is really hard to present in national data, but we wanted to really reflect it here. That other category is there because we had 38 very unique museums, and they each had their own “other.” We wanted to represent it here, but that other category in the national data includes a whole list of what it means. I think Sabre and Jen are going to talk about what it means at their sites. But for instance, we had the Cradle of Aviation Museum in Long Island. They are on a former Air Force base, and participants were mentioning not the artifacts they had there or the history they were preserving, but the history of the space itself, what that area meant to the community at a time and place. And that felt really different from anything else that we were seeing in the data.
That’s just one example; what we meant by that “other” was very specific to each museum. And then you can see in those responses the number of times things like valuing diverse communities came up. In that code, we included not just cultural diversity and racial diversity, but also age diversity, anytime they mentioned different ages and the ages they were seeing at the museum, and then strengthened relationships and increased health and wellbeing.
All right. Sabre, do you want to talk a little bit about your data?
I would love to, Michelle. This is my favorite subject. We were very excited, of course, about our quantitative data, but with the qualitative data, I just loved reading all of the comments, and especially this one: “How does this museum benefit your community?” Ours, of course, scored high again in continued learning and engagement and in our other category. As Michelle mentioned, ours was unique in that we had mostly economic statements. The Carter County Museum is the primary driver of tourism to the area where we live. Again, very rural: there are only 400 people in the town, 1,700 people in the county, which is 3,300 square miles. We’re two hours from the nearest Walmart. People really make it a point to come to the museum and then use that as a base to discover the Black Hills and discover the history of the area. It was very nice for me to see that economic impact that I had been tracking in other surveys also reflected here, and to be able to tell the county commissioners that was in there, as well as the social impact.
We were also very excited by this question, “How does our museum benefit your community?” because we’ve always had great numbers. We’ve always had 50,000 visitors and 10,000 students, but we never could quite measure the impact well: what is the benefit of our museum to the community? Our “other” in this case, which you can see is the highest on our scale, has to do with preservation. The Molly Brown House Museum is run and owned by Historic Denver, which is a private nonprofit for historic preservation here in the city of Denver. It’s amazing that the museum is fulfilling that mission of preserving history, really showing the importance of place-based learning and the importance that buildings have in our history as a city. And it’s clearly being recognized by the people who have come to our museum.
And because I always like to see the direct quotes, and I have to say, when you have questions like this, we all know funders really like to see quotes from people, here are just a few of how we parsed them into these categories. For learning and engagement, from the Molly Brown House: “This museum is an interesting piece of Colorado history, and the museum does a good job of creating unique events to get people excited to learn and visit.” This was not only about that learning piece, but the events, that activity that’s guiding it. Strengthened relationships at Carter County Museum: “It provides a place for community to gather, hold its history and work together to overcome challenges.” And then these other categories, you can see just what they were both mentioning here. At Carter County Museum, it “benefits the community by attracting tourist dollars.” They see that as a direct line. And then at Molly Brown House, “It’s an important history and cultural time capsule.” I just love that quote so much. Tourist destination there as well.
The second question we analyzed was, “How did participating in this study change your perspective of museums/cultural sites?” Again, we looked for emerging themes in the aggregate data from that sample, and that’s what we have presented here. Again, no surprise, places of learning pops to the top. I think we can all agree that’s embedded in our missions and recognized as the work museums do. But one of the other categories on here that we were really excited to see emerge was this appreciation, and you’ll see it on here both as appreciation internal and appreciation external. What those really meant was that participants were expressing the appreciation they had that museums were there, and the internal was about the work that goes into it: “I never knew how long or how much work it took to create an exhibit.” They started recognizing that, and we were really excited as museum professionals running the study: “Wow, they get it.”
The external was like, “This is just so valuable to have in our community. Our community would not be the same without this.” That was how we parsed those: internal was about the work the museum itself and the staff were doing, those events that took time to plan; external was that the museum really helps and supports the community. Then there was this theme of reaffirmed existing feelings. We had a lot of, “It didn’t change. I always loved museums, but this really reaffirmed it for me.” We also saw this come up as, again, the study was happening in late COVID times: “I visited museums all the time pre-COVID; this study got me reintegrated into visiting, and now I remember why I liked going.” We had a few examples where that really came up, and I think it was nice to see that it was that rewarding for the individual participant.
At Carter County Museum, we deliberately try to recruit more of our local community because they don’t attend the museum as often. They see it as something that they grew up with and it never changes. They might come when they bring somebody to it. And we wanted to tell people, “Hey, this changes. In fact, we have new things happening every week, every month. And yes, 30 years ago you came to the museum, but we were in the basement of the high school and now we have our own building. It’s a little different.” About half of the study participants were local to the community, and the other half were from other places. What they mentioned in this particular question was that attending the Carter County Museum and participating in this study really made them appreciate their own hometown museums and what they had to offer, and look at them differently.
That was really amazing to see. We of course were very excited when we shared our results, and we shared them very publicly. Michelle came in on Zoom and presented them to the community in a public forum event that we held, because everybody that participated said, “Okay, Sabre, you made me do this. Now what are the results?” And I said, “All right, come to this event.” We also held it on Zoom so people could join us remotely. And far and away, the staff in particular were very excited to see that appreciation of their work. It’s nice, especially for our front-end staff, to hear that their efforts are appreciated and that their work in the community is appreciated. And we actually gained a lot of volunteers from this study, people who saw the work we were doing over those three visits and now want to be a part of it. That was very exciting.
We also recruited from our local community. A lot of our visitors are from out of town; as you saw on the slide, “tourist destination” was in one of the quotes. We also have people who say, “I’ve lived in Denver my whole life and I’ve never been there.” And that’s obviously the community that we want to bring in more and more. I think part of that is that we have been working very hard over, say, the last five to ten years to break the mold of a historic house museum. It’s not just a pretty place that you can walk inside. Of course we have beautiful objects, we have stained glass, and it is a beautiful house, but it’s about the story. It’s about the story of Margaret Brown: who she was, what she fought for, and her importance in Colorado history and in national history. Using that biography piece to focus the historic house on the people who lived in it is very important to us. And clearly our visitors are appreciating that connection.
Great. We have more example quotes here for you. Places of learning: “It allowed me to appreciate the preservation of history, science, and culture in this part of the world and really allowed me to grasp how much history and culture this area has.” Carter County Museum, again, bringing those people back in. And then, as Jen was saying, this appreciation piece: “I see how hard the staff works to create fun and engaging events that really fit well with the home and history.” Just some examples from these two host museums. Along with all of this data that we collected, we did collect demographics of study participants. Again, this slide shows the aggregate data for all 2,000 respondents. These questions were optional, so not everyone responded to them. For the whole study, we found that 75% of the survey takers were not members of the museum.
I think for some of our museums that was really exciting; they really purposely tried to recruit non-members and first-time visitors into the study. The age category with the largest representation, 31%, was 35 to 44 years old, and then 83% were female. This is really high; it was a bit alarming to us as a project team. We looked at other national studies, and it is higher than, say, what Susie Wilkening finds in the museum-goer survey, which is maybe closer to about 70%. That’s something we just try to keep top of mind. And then of course you can see that big pie chart is the race/ethnicity data: about 75% identified as white or Caucasian, though we had other races and ethnicities included there. And then these are the demographics for each of the host museums, so, again, you can compare.
And thinking about the whole aggregate compared to a smaller sample: Carter County Museum, 66% female, probably more on trend with what a lot of our other host museums were seeing individually as well. 50% were 25 to 44 years of age. And then for each of our host museums, we did include this statistic for household income: 25% of participants at Carter County Museum had a household income lower than the city’s median. That was something we parsed out for each of our host museums but obviously couldn’t do nationally. And then a large portion identified as white or Caucasian. And then Molly Brown House, again, you can see, looks more like the national data: 85% female, 55% were 35 to 54 years old, just an age category higher than at Carter County. And then 36% had a household income lower than the city’s median, and again, a majority were white or Caucasian.
We collected this data. We know it’s really important for host museums to have; in some cases it was the only way for them to get demographic data, because they didn’t have the capacity to do this otherwise. But we always just say, use it with caution. It’s going to help you interpret your data, why you might see the results you see. But we want to say we recognize there’s a lot more diversity in our communities than what is represented in these charts. We wanted to include it, and we know it can be meaningful data, but use it to interpret the other data from the study. And that’s really what we wanted to focus on, the social impact piece. I’m just going to open it up with a few questions for Sabre and Jen about why your museums wanted to participate in this study.
The Carter County Museum, and I've already alluded to this several times, wanted to participate because we are a very rural museum and we really take it seriously that we serve rural communities. I wanted to make sure there was a voice there. In my application and in my many, many emails to Michelle, I said, "Hey, I'm going to be honest, I don't think we can hit that 100-person recruitment target," because again, the demographics of our community that I mentioned, but also, even after COVID, we only had about 5,000 visitors through the whole year. It's still hard to get a hundred out of that. I said, "What do I need to deliver to make this meaningful?" And Michelle said, "60." And we did. We got our 68, which is super cool.
And that took a lot of staff time, a lot of staff involvement, and the board too was very excited. We had everybody on board. This was our first survey outside of economic surveys that we’d ever done. We were really excited to get at that question of, “Yes, the museum does have an impact on our community, how can we measure that? And then how can we talk to the community about it too? Did you know that museums have an impact on health and wellbeing? We’re not just increasing learning and increasing interest, we’re also helping people make sense of their lives.” It was really cool to be able to put some numbers with that and some fun qualitative data with that too. That’s now on our website. And we’re just really thrilled to be a part of this community too. I’ve met some really amazing museums and so have our staff as a result of this.
Along those same lines, Margaret Brown, as I mentioned, has a long history of social impact in her own way. In our history, she fought for workers' rights, for juvenile justice, for women's right to vote. In those footsteps, it was very important to us to have an impact on our community and be able to measure that. As Michelle mentioned, there was a wide gamut of staff sizes among the museums. We are one of the small ones. We have five full-time staff members. I am in charge of evaluation as one of my many jobs as a museum education specialist.
The ability to participate in a national project where we can compare ourselves, but where also there’s a tool that’s already been made, there’s help for when we need to have analysis or things like that is just invaluable to me. I’m still building my evaluation skills, my background’s in teaching. I have some of that assessment, but not pure evaluation skills like a larger museum would have. It’s just been amazing to be able to participate in this ready-made study and I’m super excited for the toolkit to really push forward for other small museums to be able to do this work easily.
Thank you. How do you plan to use your data now that you have it, years in the making? What are the next steps for you both?
Our next step is to continue sharing the data with our community. As I mentioned, Michelle came for a public forum event that had about 40 people or so attend, which is 10% of the community. Pretty cool. And we're just excited to share it throughout the state, but also to mobilize the other museums in the state to use this toolkit. Yes, we got all this really great data, but we think other museums, especially in Montana, can also have that advantage, and we can really be the success story for rural museums and say, "Hey, we did it. This is what it took to do it." It was a lot of work, both from Michelle and from myself and my team, but it's something that I really believe anyone can do, even with a small staff.
I’m the only full-time and we have four other part-time staff at our museum and it’s really something that we’re excited to champion and to give also little talks around, we’re working on an expansion effort for a capital campaign that we’re kicking off this summer, and I’ve been talking to funders using this data and a lot of funders in our local community too. I have several board meetings on the docket for the summer where I’ll go present this data to them and talk to them about what our museum brings to the community and what their help financially means to the community as well and serving their constituents. Our local Southeast Electric Co-Op sent a representative to the initial sharing of the findings and has invited me to their board and other grant funders as well. We’re very excited for that.
We’re continuing to take this work on a more personal level, thinking about a cycle of intentional practice to really create our own impact statement, really go through what outcomes we want to see as a museum, measuring those across the board, not just for our tour visitors, but all of our programs, our education programs, everything that we do. This is a great starting jumping off point of some examples of things that we can use. Of course we’re going to use a lot of those quotes, especially talking about preservation. When we’re part of our Historic Denver side is talking to neighborhoods and talking to lawmakers about the benefits of saving that property in order to reuse it, to adapt it. Obviously not everything is going to become a museum, which is fine, but we still need to maintain the buildings and the history that we have so we can reuse them in different ways.
And of course, building membership, donors and all of those sorts of things as well. We're also presenting this and pushing it to AASLH to try to get even more historic houses involved, because we're a unique type of museum. We're very different from science and art museums. We have a lot of those components in our museum too, but it's very different. We're really thinking about what the impact of historic houses and historic sites is, to really promote their importance in this country and the world in general.
Great, thank you both. We’re going to pause for a moment for questions with the realization that we do have an activity. We’re going to maybe just take five or 10 minutes. Awesome. Emily, are you going to be our mic runner? Awesome. Questions for either about the project. I see someone in the back to my left, Emily.
Good morning. Thank you. This is really exciting work and I'm personally very excited to introduce this, and I can't wait for the toolkit, because I'd never really thought about this model of recruiting in advance of the survey. We've talked about the importance of doing surveys at my museum, but the idea of having to introduce it at the front desk and explain it, when you have one person juggling everything, it's a lot. This sounds very exciting. I am curious about your recruitment methods for getting people to participate in the survey, and how you balance that, especially when you're a small museum and you have that core of people that do everything you ever ask. How do you get people that have never heard of you, or haven't been there in 30 years? What are your methods for building a balanced sample?
That’s a great question because we weren’t as personal as the Carter County Museum, which was amazing that you could do that. And we did, we posted in our general public newsletter, we posted in our membership newsletter, we talked to our homeschool families who they come for specific programming, but they don’t necessarily come for the big tour experience. And that was something we thought a lot about because those are obviously audiences we have already reached. That’s why we have their email addresses. I believe we also posted, you all probably do this, but you post in the bathroom of the museum as well for your events and things like that. And I believe we did a press release as well, try to get some that way. But that is something that is a constant question. How are we getting new people to the museum? And I think if we maybe do this again down the road in a few years, that’s something that we’re going to have to think about of getting outside audiences and how we recruit those outside audiences.
We used a combination of efforts. The first thing that we did was post everywhere in the museum. We put out a press release and a newsletter, and then we hosted a recruitment effort at a community event. For us, the survey started recruiting people in about November, and that is during our annual holiday bazaar. I was there with my laptop and my signup sheet, and we learned later that I needed my own signup sheet for the Carter County Museum. We had a lot of people that either did not have access to technology or did not want to deal with the signup. We actually worked really hard with our staff to help people sign up. We provided a dedicated computer for that, both at that event and then throughout the recruitment period, so that people could come and ask us how to fill out the form.
We also did that on the other end with the survey evaluation, where people would come in and say, "I don't like computers, I don't want to deal with computers, but I do want to fill out your survey and help the museum." We would read the survey to them so they could see it and answer the questions, and they would dictate their thoughts as well. Another thing that we did was email. We have what's called the Annual Dino Shindig, and participants sign up for that ahead of time, in January of the year of the event, which happens in July. We emailed them a little bit about the study and really framed it as, "This is how you can help this museum that you've developed a relationship with. The Shindig has been going for 10 years, so after a decade of being at the Shindig, you want to know how you can give back? This is how."
And we had a lot of people sign up for that, and that's what gave us a lot of the diversity too: people that were not from the area, that were from different cultural backgrounds as well. We're really fortunate to have that input. And thanks again to Michelle: those people couldn't come until July, so they extended their visits so they could make those efforts, and we were allowed to have a longer survey study as a result, to make that accessible for us.
And then I’ll just add two from some other museums who aren’t here. One used a friend’s, a friends approach, they asked their members to be like, “You can’t participate but invite a friend to participate.” They leveraged who was already in their network. And then another museum used community partners, people they had really deep relationships with already. That was an important thing. They asked them to put that call for recruitment out through those partner newsletters. And that was really successful for first time visitors. Emily, I lost you in the crowd, I don’t know where you are for a mic.
This is more of a general question, but did any of the museums that participated offer the survey in a language other than English?
They did. This was something that we were really responsive to. For anybody looking to do a national study, have that as a budget line; we did not. The state of Utah really graciously paid to have all materials translated, and we did that in response to requests from host museums. We did not go in anticipating it, but we did translate all of the recruitment materials, all of the emails and anything we communicated with participants, as well as the social impact survey itself, into Spanish, which was the primary one, and then simplified Chinese. And then we did recruitment materials in Marshallese and Somali. Again, that was based on communities that some of our host museums identified, but we tracked it, and we didn't have participants who completed the recruitment form in those languages, so we didn't do the end survey in them. But we were very willing to translate based on need.
Hi, thanks for this work. I think it's very important for museums to get into this. I just have a slight concern, linked to the previous question: I feel the survey really needs to talk to people who have not been exposed to the museum beforehand, to avoid the confirmation bias, et cetera, that is also indicated in your data of people saying they have confirmed or they haven't changed. Shouldn't that be a requirement, almost? It's also an opportunity to recruit people into the museum, if you advertise a free ticket, for example, for anyone who hasn't visited the museum before. And maybe a second question: what are we controlling against? Is it if you send people to a theater? I know that gets very complex, and for small museums it's difficult to pull off. But what other activities do we compare to if we want to measure some change? Do you compare one exhibit to another, for example?
Very good questions. We did ask in the recruitment form how many times they had visited the museum, and I think we said since 2019. In some cases we can backtrack the social impact survey data if they opted into an incentive, which we didn't share about here, but that was where we could get their contact information, go back to their application and see how many times they had visited. That was a way we could pair it. We didn't do that level of analysis, but we could. We did ask that in the recruitment, and then I think we asked if they were members or not in the social impact survey itself at the end of the experience, so we could gauge that level of participation again. But we're also very aware that people live lives outside of going to a museum over eight months' time.
We can’t control for that. The impacts that they may be receiving aren’t solely because of this museum. It may be because of other things they’re experiencing throughout the course of the study. But that was something that we really were explicit about. We talked to our advisory committee about and we just can’t control for that. We just say that is what it is. In the survey we did try and be really explicit in the language of before visiting this museum and after to try and get them in that frame of mind of we’re asking about your museum visits, but again, there’s no way we can parse out all of those life experiences. It’s a great question and definitely worth continuing a conversation.
Did you, and if so, how, measure significance in the changes from your retrospective questions to your current questions?
We used a t-test. When you export all the data, you have all of the pre responses, and you create a mean for those and then a mean for all of the post responses. The t-test compares the difference between them and then spits out the p-value, for anyone who's statistical. It's not necessarily my area of expertise, but it spits out that p-value, and we used the 0.05 threshold: anything below that, we counted as a significant change. Jen alluded to this: for some items that didn't show statistical change, we really talked it through with each of our host museums, saying, "That doesn't mean there wasn't a change. That doesn't mean that one or five or ten of your individuals didn't mark a four before and a five after; it just wasn't enough of a change for that statistical test to pick up."
We did, especially with museums with smaller sample sizes, say, "Go through that data and parse it out, and maybe look more at percentages, like X percent had an increase." That was another way of looking for the change, even if we couldn't pull it out with a statistical test.
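The analysis described above, a paired comparison of retrospective pre scores against post scores, plus the percentage-increase fallback for small samples, can be sketched roughly as follows. This is a minimal illustration, not the project's actual analysis template; the ratings are hypothetical and the function names are my own.

```python
import math
import statistics

def paired_t(pre, post):
    """Paired (dependent-samples) t statistic for retrospective pre/post ratings."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

def pct_increased(pre, post):
    """Small-sample fallback: share of respondents whose rating went up."""
    return sum(b > a for a, b in zip(pre, post)) / len(pre)

# Hypothetical 1-5 Likert ratings for one survey item from ten participants
pre  = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
post = [4, 5, 3, 5, 4, 5, 4, 3, 5, 4]

t = paired_t(pre, post)
# With df = n - 1 = 9, the two-tailed 0.05 critical value from a t-table
# is about 2.262; |t| above that corresponds to p < 0.05.
print(f"t = {t:.2f}, significant at 0.05: {abs(t) > 2.262}")
print(f"{pct_increased(pre, post):.0%} of respondents increased")
```

In practice a spreadsheet template or a library routine such as a paired t-test function would return the p-value directly; the percentage view is useful exactly when the sample is too small for the test to reach significance.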
And then just a few final things. Here's all of our contact information; if you want to reach out to any of us, please do. We do have some surveys. Our session today was really generously sponsored by the Wallace Foundation with AAM, and we just want to make sure we capture some of your takeaways. On your way out, if you haven't already done that little half-page survey, we and they would greatly appreciate your participation. And then Sabre, do you want to share about the blogs of the project?
Yes. Throughout the course of this project, we collaborated with AAM and other museums on several blogs that really delve into the questions of recruitment, retention, our experiences, and what we want to do with the data. For any of those questions that you asked, if you want to learn about other perspectives, just go to AAM and search for social impact, especially in the blog area.
Awesome. And we do have time together, if you have other questions, we’ll just give it a few minutes. You can raise your hand and maybe one of you can run with a mic. Do we have any questions from the audience?
If I missed this, I’m sorry, but one of those slides had negative as one of the things that happened after participating, I wondered what that means.
Good catch. We did have a few negatively worded statements on the survey. For instance, "I feel the stress of life" is not one that we wanted to see increase; we wanted the post score to be lower than the pre score. How that's handled is that when they rate it, we reverse code it in the data. We just switch the order so that the statistic goes in the right direction, if that makes sense. Yes, the template to do that statistical analysis corrects the negative to a positive.
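Reverse coding like this is a standard survey-analysis step: flip the scale on negatively worded items so that "improvement" points in the same direction for every item before running the pre/post comparison. A minimal sketch, assuming a hypothetical 1-5 Likert scale (the actual scale used by the study's template may differ):

```python
# Hypothetical 1-5 Likert scale bounds
SCALE_MIN, SCALE_MAX = 1, 5

def reverse_code(rating):
    """Flip a rating on the scale: 1 <-> 5, 2 <-> 4, 3 stays 3."""
    return SCALE_MAX + SCALE_MIN - rating

# "I feel the stress of life": a drop from 4 (pre) to 2 (post) is a good
# outcome; after reverse coding it shows up as a rise from 2 to 4, so the
# same "higher post = positive change" test works for every item.
pre, post = 4, 2
print(reverse_code(pre), reverse_code(post))  # 2 4
```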
This might not be that much of a question, but what I’m thinking, I really love this activity of taking a look at these different steps and I feel like something to establish like a community of practice with museums that sign up would be really cool. I would love to keep chatting with people.
Can I take this one?
Definitely. Go for it. Emily.
Michelle and I submitted another request to IMLS in November to continue funding our museum social impact work. That includes establishing a community of practice and pairing a new incoming cohort of museums with our alumni, if you will, from this previous cohort. Great idea. And I think there are lots of interested and engaged individuals on that level, whether or not the project is funded. Definitely keep your eye out for that.
Fingers crossed that gets funded. Even if it doesn't, our team just met with several of our host museums on Wednesday, and we are going to collect interest in whether, when we publish the toolkit, we can also share their contact information as somebody in your region that you could reach out to for help and support. I think all of them would be interested, because they're amazing museum professionals, and a lot of them were amenable to that idea. Even if we can't establish a formal community, you would have access to somebody who could support you and start to develop that community too. We'll try one way or the other.
I’m doing a visitor experience survey at the moment at Planet Word, which is a linguistics museum in DC. And my question is, are you doing anything, any surveys to try to improve the museum for visitors, which is what my survey is about?
That wasn’t the purpose of this particular study. We were really focused on the social impact dimension specifically. That wasn’t our goal, but come up and we can chat in other ways to find some of that information that might be of use to you. We do have several audience research and evaluation professionals in this space that can maybe point you in some right directions. Thank you all so much for coming.