What if you could hear what characters in a play were thinking, as well as what they were saying? How would that change how playwrights create and audiences experience performance?
While my main futurist focus is museums, I keep an eye on the rest of the cultural sector as well. Two of my most interesting engagements were with the League of American Orchestras and Opera America, envisioning how the performing arts might change in coming decades.
So I was delighted to discover Ghosts, Toast and The Things Unsaid, a play with “digital at its core.” It is one of the first cadre of projects incubating in the Australian Centre for the Moving Image’s ACMIx accelerator program. Ghosts intrigues me because:
- What a participant hears depends on which actor they are looking at, which gives each member of the audience control over how they navigate the script. It’s a tech-amplified version of “promenade” theater, in which audience members wander through a space, controlling the order in which they see segments of the performance. (Punchdrunk’s Sleep No More, an adaptation of Macbeth set in a large warehouse in Manhattan, is one famous example of this interactive format.)
- Ghosts takes advantage of the sensors in a mobile phone (gyroscope, accelerometer) in a way that doesn’t involve staring at the device itself. So unlike many augmented reality experiences (including most AR museum apps), it directs the user’s attention out into the world, rather than funneling it through a tiny screen.
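To make the idea concrete, here is a minimal sketch (not the production code, which isn’t public) of how gaze-dependent audio selection might work: use the phone’s orientation sensors to estimate a compass heading, then play the “inner voice” track of whichever actor is nearest to where the listener is pointing. All names and stage positions below are hypothetical.

```python
# Hypothetical sketch of heading-based audio selection.
# Assumes the phone's gyroscope/compass gives us a heading in degrees.

def angular_distance(a: float, b: float) -> float:
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def select_track(heading: float, actor_bearings: dict) -> str:
    """Return the actor whose stage bearing is closest to where
    the listener's phone is pointing."""
    return min(actor_bearings,
               key=lambda name: angular_distance(heading, actor_bearings[name]))

# Hypothetical stage layout: bearings in degrees from the listener's position.
actors = {"alice": 350.0, "ben": 90.0, "carol": 200.0}
print(select_track(10.0, actors))  # "alice" is nearest (20 degrees away)
```

In practice the hard parts are sensor fusion (raw compass readings drift and jitter) and smoothing, so that the audio doesn’t flicker between actors as the listener’s head moves, but the core mapping from orientation to audio track can be this simple.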
This made me think about the potential for adapting this approach to the museum visit. What if you could “read the mind” of people in portraits, or the animals in a diorama? The result could be funny, informative, and engaging. I think it could focus people’s attention on what they were seeing, and promote a social experience around comparing what they heard.
Take a look at this video [4 min 17 sec] about Ghosts, Toast and The Things Unsaid and see what you think.