Whenever a new technology starts to take off (drones, 3D printing, Virtual Reality), the initial breathless coverage in the press is inevitably followed by articles deriding the trivial applications of these advances. We’re deluged with cartoons of drones as Peeping Toms, complaints about how long it takes to 3D print anything substantial, and first-person narratives of VR nausea.
Chatbots (interactive conversation programs powered by artificial intelligence) are no exception. The past year has seen the introduction of a number of chatbots, at least in beta: Opla, Mitsuku, even a personal accounting chatbot created by Sage. Perhaps the most accessible and affordable is Alexa, Amazon Echo’s virtual-assistant-in-a-cylinder, eager to facilitate your life, and your shopping.
As soon as functional chatbots were trialed in public, they were being mocked and hacked. In March, Microsoft introduced Tay, a Twitter bot described as an experiment in “conversational understanding.” Designed to learn from its interactions with real people, Tay was targeted within 24 hours by Tweetjerks who taught it to parrot racist, misogynistic remarks. V. funny, eh?
But despite the snark, each of these new technologies has the potential to address real-world problems. Drones are being deployed to protect endangered species; in medicine, 3D printers are being used to create prosthetics, drugs and replacement tissues; VR is being used to engender empathy for refugees and the homeless.
For chatbots, one emerging real-world application is emotional support. For example, MIT’s Media Lab has launched Koko, a social network designed to enhance emotional well-being. The lab’s goal is that someday companies will be able to license an “empathy API” from Koko for use in their own chatbots. Perhaps an “empathetic” AI, especially one with permission to monitor social media channels, would be able to spot the warning signs of depression or self-harm, provide some front-line intervention, and encourage a troubled individual to seek skilled human help. Koko is testing this concept with a bot on the messaging app Kik that offers sympathetic, reflective listening.
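To make the “empathy API” idea concrete, here is a minimal sketch in Python of how a chatbot might route its replies through an empathy layer. Koko has not published such an API, so every name below is invented for illustration, and the keyword list is a crude stand-in for what would, in a real system, be a trained model:

```python
# Hypothetical sketch only: Koko has not published an empathy API.
# All function names and the keyword heuristic below are invented;
# a real system would use a trained classifier, not a word list.

DISTRESS_KEYWORDS = {"hopeless", "worthless", "so alone", "hurt myself"}


def flag_distress(message: str) -> bool:
    """Crude stand-in for an empathy model: flag messages containing
    distress language."""
    text = message.lower()
    return any(keyword in text for keyword in DISTRESS_KEYWORDS)


def empathetic_reply(message: str, base_reply: str) -> str:
    """Wrap the bot's normal reply with reflective listening (and a nudge
    toward human help) when distress is detected; otherwise pass through."""
    if flag_distress(message):
        return (
            "That sounds really hard, and I'm glad you told me. "
            "If you're struggling, reaching out to someone you trust can help. "
            + base_reply
        )
    return base_reply


if __name__ == "__main__":
    print(empathetic_reply("I feel so hopeless lately", "Here's today's weather."))
    print(empathetic_reply("What's the weather?", "Here's today's weather."))
```

The design point is that the empathy layer sits in front of the bot’s ordinary behavior: an assistant like Alexa could keep answering shopping and weather questions as usual, escalating its tone only when the screening step fires.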
Your Futurist viewing this Friday: watch the video below [2.23 minutes] that previews how Amazon’s Alexa might behave when spliced with Koko’s digital genes:
Here’s your thought question for the week: If you had access to Koko’s empathy API, what would you plug it into? Who would you want it to talk to, about what, and why? I’d like to hear your ideas.