
Museums and AI: Is There a Ghost in the Machine?

Category: Alliance Blog
A smartphone sits on a flat surface. On the screen is a drawing of a single eye, an illustration of the privacy issues that modern technology can pose.
Photo credit: Book Catalog on / CC BY 2.0

What are the ethical implications of AI? What role do museums play in the global conversation about data privacy? How can bias impact AI systems?

This post, one of three on the topic of artificial intelligence in museums, explores those questions, with information pulled from a live convening I attended.

The Knight Foundation and the American Alliance of Museums hosted this series of convenings about technology topics at museums around the country this fall. Rob Stein, AAM’s Executive Vice President & Chief Program Officer, approached the AAM Media and Technology Professional Network to help conceptualize these convenings. After much discussion, the M&T working group identified two large areas of focus: Experience/Immersion, and Data/Artificial Intelligence. The convening I cover in these posts, on the theme of Museums and New Intelligences, took place in November at the Pérez Art Museum in Miami.

The first post was an overview of AI and museums. The second post offered a few thought exercises museum professionals can use to explore the future of AI. This final post deals with the ethical issues of AI and how those issues pertain to museums. Privacy is a particular concern for all organizations that collect data, and one the field has yet to handle systematically.



Artificial intelligence (AI), the use of computer systems to perform tasks that normally require human judgment, often by analyzing large quantities of data, has already transformed human existence. Amazon knows your taste as well as it does thanks to AI. Facebook has the unfortunate ability to tag your face in your friend’s unflattering pictures thanks to AI. Email is largely free of spam, again, thanks to AI. Productivity and our contemporary way of life are indebted to AI.

Technological mediation in our society is generally so seamless that it passes unremarked. However, our general ignorance about these systems means we can inadvertently become party to unethical and personally detrimental actions. Museums already use AI, and the practice will only expand. To accomplish their missions ethically, museum professionals need to be aware of the ethical and personal implications of AI.

AI isn’t Machines against People; It’s People Using Machines

The term artificial intelligence obfuscates the inherently human nature of the technology. It is easy to think of AI systems as thoughtful machines, but that is a fallacy. People create the data. People choose the data to aggregate. People tell the machines what to do with the data. Ignoring the relationship between human action and AI has important downsides.

As Surya Mattu, Research Scientist at the Center for Civic Media at the MIT Media Lab and a resident at Eyebeam, reminded those gathered at the Pérez in November, the people behind these systems are easy to overlook.

Mattu offered an example that is particularly relevant to museums. Most content-focused museum staff have at least once availed themselves of full-text searching on Google Books. The service can make research easier, but those free books have a social cost. Google has a number of classifications for workers on its campuses. Alongside the staff who enjoy infinite vacation, office slides, and special snacks, there is a group of lower-paid workers who receive no such benefits for scanning books. While hierarchical systems and lower pay for unskilled labor are certainly not news in the current American corporate landscape, the issue for the book scanners is their invisibility. Books appear in Google searches as if by magic, when in fact a group of humans works for low pay to make those texts available globally.

Labor challenges aren’t the only systemic social problems replicated within AI. Biases like racism are inherent to everything people produce, including technology. As Mattu said, “Bias isn’t a bug; it’s a feature.” Without careful thought, the racism of society will remain inherent to all cultural products. The classic example of racism in AI is machine vision used to make assessments about humans. Machines replicate the same poor judgments about people of color as humans do, because the algorithms are built without scrutinizing the engineers’ inherent biases. This bias is particularly salient for museum professionals who plan to use machine vision to analyze images: the same faults that cause consumer systems to misjudge photographs of people of color will carry over when museums analyze images of people of color in their collections.
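The mechanism described above can be made concrete with a toy simulation: a matching system whose decision threshold is tuned on data from one group only will fail more often on an underrepresented group, even though the code never mentions group identity. Everything here is invented for illustration (the groups, the score distributions, the 5 percent target); it is a sketch of the mechanism, not a model of any real vision system.

```python
import random

random.seed(0)

# Toy simulation: a "match score" model whose decision threshold is
# tuned on data from one group only. Groups and distributions are
# invented for illustration; no real system or dataset is described.
def scores(mean, n=1000):
    return [random.gauss(mean, 1.0) for _ in range(n)]

group_a = scores(mean=2.0)  # well represented in the tuning data
group_b = scores(mean=1.0)  # underrepresented: systematically lower scores

# Threshold chosen so about 5% of group A's genuine matches are rejected
threshold = sorted(group_a)[int(0.05 * len(group_a))]

reject_a = sum(s < threshold for s in group_a) / len(group_a)
reject_b = sum(s < threshold for s in group_b) / len(group_b)
# Group B's false-reject rate comes out far higher, though the code
# contains no explicit reference to group identity: the bias lives
# in the data the threshold was tuned on.
```

The point of the sketch is Mattu’s: the bias is not a bug introduced by the code, but a feature inherited from the data the system was built on.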

Every person who uses data has a part in ensuring that the data is produced ethically and that bias is mitigated. This requires that all of us increase our understanding of data creation and analysis. This type of data literacy does not require actually crunching numbers or writing algorithms. Instead, as Mattu suggests:

People are in charge of machines. We have a duty to ensure that these machines are working in ways that are not detrimental to people.

Museum professionals can’t outsource the work of facing down potential biases in their systems. Fear or ignorance will not save museums from the potential ghosts in the machines. They need to face the potential problems with their data head-on.

The Falsehood of Privacy

Before we get into fears around AI and privacy, it’s worth examining the concept of privacy itself. Most scholars who study material culture from before the industrial revolution understand that privacy is a social construct. Even non-scholars have some sense of the falsehood of privacy, whether they consciously realize it or not. Do you remember a time when you didn’t think about your picture being tagged on social media or captured without your consent? These cultural shifts highlight the elasticity and constructed nature of the concept of privacy.

This elasticity is particularly important when thinking of AI. We are at a cultural moment when our personal data is owned by corporations and inadvertently exposed without our consent. Most people are unaware of the level of personal data available broadly. Rather than scrutinize their risk, people have created a convenient working definition of privacy that ignores the frequent exposure of their data. For example, as Mattu noted, smart devices, like smart toothbrushes, leak usage data.

Privacy might seem easy for museums, which are already well-versed in maintaining privacy and security over collection and donor records. But most museums have yet to address data privacy holistically. Most keep personal data on patrons, such as members and donors. Some also track the movements of visitors through their apps, often keeping anonymized data to inform decisions. Museum websites might also harvest personal data.

The European Union has helped move some museums to consider data privacy. In effect since 2018, the General Data Protection Regulation (EU) 2016/679 (“GDPR”) aims to give individuals control over their personal data. Some American museums are still working out how GDPR will affect their websites. The regulation could be seen as a provocation for American museums to become national leaders on this issue.

Museums hold history in trust for our visitors. By working together on the best ways to handle visitor data, museums can show they also hold visitors’ best interests in trust. This would require museums to communicate what data they keep and why, to anonymize data as often as possible, and to secure the personal information that patrons choose to let museums keep.
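As one illustration of what “anonymize data as often as possible” can mean in practice, the sketch below pseudonymizes hypothetical patron records in Python: direct identifiers are replaced with salted hashes and postal codes are coarsened before analysis. The field names and records are invented, and salted hashing is pseudonymization rather than true anonymization, so this is a starting point, not a compliance recipe.

```python
import hashlib
import secrets

# Hypothetical patron records; the field names are illustrative
# assumptions, not drawn from any real museum system.
patrons = [
    {"email": "visitor@example.com", "zip": "33132", "visits": 4},
    {"email": "member@example.org", "zip": "33101", "visits": 12},
]

def pseudonymize(records, salt):
    """Replace the direct identifier with a salted hash and coarsen
    the postal code, keeping only the fields needed for analysis."""
    safe = []
    for r in records:
        digest = hashlib.sha256((salt + r["email"]).encode()).hexdigest()
        safe.append({
            "visitor_id": digest[:12],   # stable pseudonym for a given salt
            "zip": r["zip"][:3] + "XX",  # coarsened location
            "visits": r["visits"],
        })
    return safe

salt = secrets.token_hex(16)  # store separately from the data; rotate periodically
safe_records = pseudonymize(patrons, salt)
```

Because the pseudonym is stable for a given salt, analysts can still count repeat visits without ever handling an email address.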

To learn more about this event, please see: Museums and New Intelligences.


