Saturday, December 9, 2017

How To Be Human


There is a chatbot that pretends it's a kid so that it can catch child predators. It doesn't entrap them into doing anything illegal; the point is to make the offender aware that what they're doing is wrong.

The part I found most interesting was how the developers worked with people who had lived through this world, survivors who had been preyed upon themselves, to help design a believable-sounding bot.


The chatbot taking on Seattle's sex trade
Nov 2017, BBC

The challenge for developers was to make sure this chatbot was authentic. Any unusual behaviour, or nonsensical response, would tip off the target.

"We work with survivors of trafficking to ask them how a conversation like this would go," explains Mr Beiser.

It's the small touches that help here. Replies aren't instant. There is sloppy, bad English. It's by no means perfect, but during the bot's test phase earlier this year, 1,500 people interacted with the bot long enough to receive the deterrence message - a remarkable completion rate given the bot will ask for a selfie of the buyer as part of that conversation.

As more people use the bot, the smarter it could potentially become. The project has the backing of Microsoft, one of the tech firms leading the way on natural language research.
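To get a feel for the "small touches" the article describes, here is a minimal sketch of how a bot might humanize its replies: a typing delay instead of an instant response, and deliberately sloppy text. This is only an illustration, not the project's actual code; every name and parameter below is made up for the example.

    # A hedged sketch of humanized bot replies: delayed, sloppy, imperfect.
    # Nothing here is from the real system; the keyboard map and timings are invented.
    import random
    import time

    # Hypothetical keyboard-neighbour map used to fake plausible typos.
    NEARBY_KEYS = {"a": "s", "e": "r", "i": "o", "o": "p", "t": "y", "n": "m"}

    def sloppify(text, typo_rate=0.08):
        """Lowercase the text, drop most punctuation, and inject occasional typos."""
        out = []
        for ch in text.lower():
            if ch in ".,!?" and random.random() < 0.7:
                continue  # drop most punctuation, like hurried phone typing
            if ch in NEARBY_KEYS and random.random() < typo_rate:
                ch = NEARBY_KEYS[ch]  # swap for an adjacent key
            out.append(ch)
        return "".join(out)

    def humanized_reply(text):
        """Wait a 'typing' delay proportional to message length, then send sloppy text."""
        delay = 1.5 + 0.15 * len(text) + random.uniform(0.0, 2.0)  # seconds
        time.sleep(delay)
        return sloppify(text)

    if __name__ == "__main__":
        print(humanized_reply("Sure, where do you want to meet?"))

The interesting design choice is that believability comes from imperfection: the bot gets more convincing by being slower and messier, which is the opposite of what you usually optimize for.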
