The good news about the CFM Big Data session in Seattle is that it was very well attended. Actually, that was the bad news, too, as the room was overflowing and we had to turn folks away. I suppose there are worse problems to have—I am glad people are eager to learn about this topic—but I’m sorry that not everyone who wanted to come could get in. I hope this post redresses that to some extent, as well as sharing the session content with people who didn’t attend the conference.
My fabulous panelists were:
John Lucas, director of Solutions Delivery, Business Solutions, Avnet Services
Big data is one of the disruptive forces explored in this year’s CFM TrendsWatch report. A brief summary for those who are not familiar with the topic: the sheer volume of digital data generated and captured in our world is exploding, and as this happens, the ability to selectively analyze these huge data sets will become a new source of value and competitive advantage. Museums of all sizes need to understand the implications of big data—how it can help us understand our audiences and their needs, how we can make our operations more efficient and effective, how we can create new value and sources of income through smart use of information, and how to navigate the associated privacy concerns.
Big tech companies and major retailers are at the forefront of experimenting with data mining, but some cultural organizations are already in the game as well. Every “touch” point with a visitor—ticket sales, the museum store, food service, even engagement with exhibit kiosks or via social media—can generate data that, when combined, creates a richer portrait of how people are using museum facilities. This in turn can inform decisions about programming, marketing, staffing, stocking, and communications.
Rob shared how the Dallas Museum of Art (DMA) is implementing data collection and analysis through their DMA Friends program in order to encourage repeat visitation and build long-term relationships. The Friends program invites people to enroll for free membership, presents a menu of participation options, and awards credit and recognition for visitor involvement. In the course of enrolling and participating in the program, members contribute data DMA uses to measure its performance and tailor visitor experiences. Rob discusses this topic in depth in the following papers that he co-authored with Bruce Wyman:
Where DMA developed an in-house system to track and foster visitor engagement, Donna Powell’s organization worked with Avnet and IBM to implement point-of-sale and control systems that help them make real-time decisions about staffing and operations. You can read more about that in this post Donna wrote for Wired:
The Cultural Data Project was started by Pew Charitable Trusts, and spun off as a separate organization in 2013. It arose from a desire to streamline the application process for foundation funding by providing a central repository for the basic information asked for in such forms, so that nonprofit cultural organizations don’t have to answer the same questions over and over again. Of course, such a consolidated database quickly becomes a powerful research tool as well. CDP now serves more than 14,000 arts and cultural organizations in 12 states and the District of Columbia and, as it expands to national coverage, it will be one of the few big repositories of data about cultural organizations per se. You can read more about this work in the report New Data Directions for the Cultural Landscape: Toward a Better-Informed, Stronger Sector and hear Beth discuss the CDP in the associated webinar.
John was a particularly valuable speaker to nab for this session, as he comes out of our world (he implemented data analytics at the Cincinnati Zoo & Botanical Garden as their director of operations) and now works in the data analytics field. I first learned about his work from reading about IBM’s work with History Colorado (as profiled in this case study). Not every museum is going to be able to build its own analytics system in-house. Our field needs to become an intelligent consumer of services such as those provided by Avnet, taking advantage of turnkey solutions as they become more sophisticated and affordable, and we need to work with people like John to educate these service providers about our needs.
Both at this session and in side conversations, museum staff puzzled over how to tackle analysis of data that can get so dauntingly big. John’s advice: “Don’t try to boil the ocean…go after the things that have the highest ROI and the things that have the greatest impact on your customer experience first.” As Rob said, “If you start too big, you will just quit and give up, because you are not smart enough yet to know what you need. So you have to treat it as a process where you and your staff get smarter over time. It almost doesn’t matter what data you’re recording in the process. If…your staff are learning, you will figure it out.”
All the session recordings from the conference, including Big Data, are available for purchase. Session handouts are also available for download through June 30—while this session did not have handouts, you can access the Prezi online if you want to queue it up to watch while you listen to the session recording.
As John pointed out, while data analytics has been around a long time, museums are only just beginning to get into the game—fortunately, at a time when the tools are becoming easier and less expensive for non-technical people to use. But we need to mentor each other to speed adoption of big data analytics in our field. If your museum is using data analytics, or wants to start doing so, please weigh in via the comments section below to make connections with your peers and build a community of practice around this rapidly evolving technological tool.