
Taking a Museum Education Study from Research to Practice

A museum education department is refining its docent-led tours with the tools of a research study. Photo credit: Cincinnati Art Museum.

People often talk about going “from research to practice,” but in reality the transition is fraught with challenges, not just in museums but in any field. One primary challenge is that research institutions operate in a realm of theories, hypotheses, and slow thinking, while museum practitioners operate in a quickly changing world that requires action. How do researchers communicate their findings to practitioners in a way that is timely, resonant, and actionable?

One researcher-practitioner duo faced this conundrum recently upon completion of a study of the effectiveness of single-visit field trips for students. Stephanie Downey, the director of research firm RK&A, worked with Emily Holtrop, Director of Learning and Interpretation at the Cincinnati Art Museum, on a national study of single-visit programs at art museums for the National Art Education Association’s Museum Education division in partnership with the Association of Art Museum Directors. In sharing the study with practicing museum educators, they were surprised to see that, at least for now, the tools of the research have taken on more immediate influence than the results themselves. In this post, Stephanie and Emily will share how a rather ordinary research tool emerged as a significant influence on museum educator practice.

Stephanie

When a research study is complete, the most broadly shared part is the results, sometimes in the form of a few bullets or a paragraph, and rarely more than an executive summary. This is despite the fact that the study in its entirety may have taken years to complete and included multiple elements and phases. That is understandable, since in the end we do research for the results, but it is sometimes hard for me as a researcher who has lived with a study and all its intricacies. So, having resigned myself to this reality, I was surprised and quietly thrilled to realize that, at least among the research’s primary audience of museum educators, something other than the results was emerging as the star player: one of the instruments used to collect data, the Program Observation Tool and Teaching Behavior Examples.

At the end of a research study, these data collection tools are usually relegated to the appendix of a report, where only “data nerds” typically look for them. Yet they are absolutely critical to executing research whose results you can have confidence in. We researchers have a saying, “garbage in, garbage out,” meaning that the data you collect, and how you collect them, have everything to do with the results that emerge: if your data collection tools are not solid, your results will be flimsy and will not hold up under scrutiny. With this in mind, we spent a full year developing and testing all the data collection instruments, all of this after a planning year spent refining our research hypothesis and completing a literature review.


The Program Observation Tool was one of several instruments we employed for the impact study. We also used a rubric-scored student performance assessment, a student questionnaire, a teacher questionnaire, and interviews with teachers and program facilitators. This mixed-method approach allowed us to examine single-visit art museum programs from multiple perspectives, and each method played an important role.

The purpose of the Program Observation Tool was to help us contextualize whatever we learned from our data on students in the programs. For instance, if our hypothesis was that a single-visit program would enhance students’ creative thinking, the observation data would help us understand why that happened. By observing the strategies employed by the paid and volunteer educators running the program, we could see whether there was alignment between what they did and the outcomes for students.
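To make the idea of “alignment” concrete, here is a rough sketch in Python. It is not the study’s actual analysis, and every number in it is an invented placeholder: it simply compares student outcome scores between programs whose educators were rated high versus low on one observed teaching behavior.

```python
from statistics import mean

# Hypothetical data, for illustration only; none of these figures
# come from the NAEA/AAMD study. Keys are made-up program IDs.
student_scores = {"p1": 3.2, "p2": 2.1, "p3": 3.5, "p4": 2.4}  # rubric-scored student outcome
behavior_ratings = {"p1": 4, "p2": 2, "p3": 4, "p4": 1}        # observer rating, assumed 1-4 scale

# Split programs by how observers rated the teaching behavior,
# then compare mean student outcomes across the two groups.
high = [student_scores[p] for p, r in behavior_ratings.items() if r >= 3]
low = [student_scores[p] for p, r in behavior_ratings.items() if r < 3]
print(f"mean student score, highly rated programs: {mean(high):.2f}")
print(f"mean student score, lower rated programs:  {mean(low):.2f}")
```

If the highly rated programs also show stronger student outcomes, that is the kind of alignment the observation data lets researchers look for.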

We developed the tool to be standardized but still open-ended. Data collectors took handwritten notes during the programs and then immediately completed a guide in which they rated the educators’ teaching. They were required to justify their ratings using examples from the program. In training, we extensively reviewed examples of the teaching behaviors we would classify as supporting each student capacity, drawing on examples we had generated when pretesting the tool the year before. As I trained these data collectors in Houston, I was proud to see that our careful and deliberate development was working: I watched the team talk about what they had seen, referring again and again to the examples provided in the tool.
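The tool itself is a guide completed by observers, not software, but a minimal sketch can make its logic concrete. Everything below, from the field names to the four-point scale, is an assumption for illustration rather than a detail of the actual instrument; the one rule it does mirror is that every rating must be justified with an example from the observed program.

```python
from dataclasses import dataclass, field

RATING_SCALE = (1, 2, 3, 4)  # hypothetical scale; the real instrument's may differ

@dataclass
class BehaviorRating:
    capacity: str       # student capacity, e.g., "creative thinking"
    behavior: str       # the teaching behavior observed
    rating: int         # observer's score on the scale
    justification: str  # required: a concrete example from the program

    def __post_init__(self) -> None:
        if self.rating not in RATING_SCALE:
            raise ValueError(f"rating must be one of {RATING_SCALE}")
        if not self.justification.strip():
            # Mirrors the tool's rule that every rating be backed
            # by an example from the observed program.
            raise ValueError("every rating requires a justification")

@dataclass
class ProgramObservation:
    program_id: str
    educator_type: str  # "paid" or "volunteer"
    notes: str          # transcribed handwritten notes
    ratings: list[BehaviorRating] = field(default_factory=list)

# Usage: a rating with an empty justification raises immediately,
# enforcing "standardized but still open-ended" record-keeping.
obs = ProgramObservation("demo-tour", "volunteer", "Students gathered at the mural...")
obs.ratings.append(BehaviorRating(
    capacity="creative thinking",
    behavior="asked open-ended questions about the artwork",
    rating=4,
    justification="Asked 'What else could this shape be?' and waited for three responses.",
))
```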

The Program Observation Tool, and particularly the Teaching Behavior Examples we developed to help observers make sense of what they saw, turned out to be illuminating for the museum educators we worked with on the study too. While they all live and breathe these programs, seeing what they do laid out in concrete terms brought the core of their practice to the surface. The examples gave them solid descriptions to use when talking about their techniques, which they often develop intuitively. Before the study ended, a core team of museum educator advisors flagged the observation tool as crucial. For this reason, we used the tool in practice exercises during the symposium we held on the study, and we have heard from many museum educators about how they are using it in their work.

Below, Emily will share with you how she has used the tool with her Cincinnati Art Museum Docent Corps.

Emily

Serving as the project director for the NAEA/AAMD Impact Study has been a wonderful supplement to my role as director of learning and interpretation at the Cincinnati Art Museum (CAM). With the framework set out by the study’s research question, I can look critically at the work we do with students at the CAM: Are we developing our students’ capacities in the way our research question describes? Are we encouraging them to use their creative and critical thinking skills? Are we boosting human connections and sensorial responses? Are we using the best tool in our docent observations to answer these questions? The answer to the last question, we determined, was most likely no. Luckily, the answer to our need was already in my hands: the RK&A Program Observation Tool and Teaching Behavior Examples.

Our long-running volunteer docent body, the CAM Docent Corps, consists of eager learners who embrace every museum education pedagogy placed in their toolbox. As part of the museum’s rigorous development program, the docents participate in monthly training sessions designed to teach them not just the “what” but the “how” of connecting our collection with our visitors. To learn the “what,” they listen to curators lecture on our permanent collection and special exhibitions. To learn the “how,” they participate in workshops led by museum educators on topics ranging from touring tools to ways of connecting current social and cultural issues to art with visitors. We strive to keep the methods they are trained in inquiry-based and interactive.

The docents also participate in a rotating three-year observation program, in which Learning & Interpretation staff shadow programs, give feedback in extended discussions, and then complete detailed evaluations of the docents. These evaluations help staff identify trends across the corps and plan future trainings in areas that need improvement.

In completing the evaluations, our team had felt for some time that we needed a new observation tool to capture more meaningful feedback. Our previous tool, while useful for gauging the general tone of a tour, did not allow the focused, in-depth look at every aspect of the program that we needed to glean meaningful results. So we were very lucky to find the Program Observation Tool from the NAEA/AAMD Impact Study.

Hoping our docents would embrace the new tool as part of their ongoing docent development, we asked a small team from the corps to help us plan a gallery workshop for their fellow volunteer museum educators. The workshop summarized the study’s findings and introduced the tools with practice exercises. The group broke into “participants,” who engaged in open dialogue on a work of art, and “observers,” who used the Observation Tool to evaluate them.

We then had a lively discussion about the tool and what we had witnessed. One concern among the docents was that L&I staff would expect them to do everything listed in the inventory of techniques. We assured them that we would not, and that the Observation Tool and Teaching Behavior Examples should be seen as a guide to what can be accomplished, not a checklist of what absolutely must be done.

Here’s what some of the docents had to say about the study and the observation tool:

“I’m always looking for ways to improve how I organize and lead tours. The Impact Study provided excellent research and guidelines to incorporate their findings in my planning and dialogue with guests. It really helped me prepare for the workshop. My fellow docents’ responses—both my observers and ‘tour guests’—to the demonstration and their comments showed that they understood the new information and how they could use it on their tours. My tour guests are benefitting from the Impact Study.”

—Helen Rindsberg

“In preparing for the workshop on implementing the Impact Study, I found the concepts incorporated in the research were exactly what I require in my tours. There, in clear and plain language, was a road map for setting up a rewarding experience for both guests and docents. Using the research and the guidelines helped me to prepare for the workshop and are now the model for all my tours. Most rewarding, however, was how our museum docents reacted to these teaching practices. I loved their excitement and involvement. I just can’t say enough good things about this study.”

—Pat Cordes

Moving forward, we are eager to try a version of the Observation Tool for our next cycle of docent observations. We are excited to review the data we get from it and apply that knowledge to even more engaging docent-led tours.
