Sharing a laugh: Scientists teach a robot when to have a sense of humor
Sep 2022, phys.org
In the shared-laughter model, a human initially laughs and the AI system responds with laughter as an empathetic response. This approach required designing three subsystems — one to detect laughter, a second to decide whether to laugh, and a third to choose the type of appropriate laughter.

The scientists gathered training data by annotating more than 80 dialogues from speed dating, a social scenario in which a large group of people interact one-on-one for brief periods. In this case, the matchmaking marathon involved students from Kyoto University and Erica, teleoperated by several amateur actresses.

"Our biggest challenge in this work was identifying the actual cases of shared laughter, which isn't easy, because as you know, most laughter is actually not shared at all," Inoue said. "We had to carefully categorize exactly which laughs we could use for our analysis and not just assume that any laugh can be responded to."
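The three-subsystem design described above can be sketched as a simple decision pipeline. This is a hypothetical illustration only: the function names, inputs, and threshold heuristics below are stand-ins I've invented for clarity, not the authors' actual trained models.

```python
# Hypothetical sketch of the three-subsystem shared-laughter pipeline.
# All names and thresholds are illustrative assumptions; the paper's
# actual subsystems are trained on annotated speed-dating dialogues.
from typing import Optional


def detect_laughter(audio_features: dict) -> bool:
    """Subsystem 1: did the human just laugh?"""
    # Stand-in flag; the real system would use an acoustic classifier.
    return audio_features.get("is_laugh", False)


def should_laugh(context: dict) -> bool:
    """Subsystem 2: decide whether to respond with laughter at all."""
    # Most laughter is not shared, so only respond when the estimated
    # probability of a shared laugh clears a threshold (assumed value).
    return context.get("shared_laugh_prob", 0.0) > 0.5


def choose_laugh_type(context: dict) -> str:
    """Subsystem 3: pick an appropriate type of laugh."""
    # Illustrative split between a hearty laugh and a polite social one.
    return "mirthful" if context.get("shared_laugh_prob", 0.0) > 0.8 else "social"


def respond(audio_features: dict, context: dict) -> Optional[str]:
    """Run the three subsystems in order; return a laugh type or None."""
    if not detect_laughter(audio_features):
        return None
    if not should_laugh(context):
        return None
    return choose_laugh_type(context)


print(respond({"is_laugh": True}, {"shared_laugh_prob": 0.9}))  # mirthful
```

The point of the staged design is that declining to laugh is a valid output at two separate gates, which matches the researchers' observation that most laughs should not be reciprocated.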
via the Graduate School of Informatics, Kyoto University: Can a robot laugh with you?: Shared laughter generation for empathetic spoken dialogue, Frontiers in Robotics and AI (2022). DOI: 10.3389/frobt.2022.933261
Image credit: AI Art - LIFE Magazine Cover - 2022
Engineers Gave a Car a Pair of Eyes to Make Future Roads Safer For Pedestrians
Oct 2022, Science Alert
In one of the more unusual experiments we've seen recently, researchers attached a large pair of cartoonish googly eyes to the front of a small, self-driving vehicle – and it turns out that this kind of anthropomorphic tweak could actually improve pedestrian safety.
Not unusual at all, I've been waiting for this exact thing to happen.
The virtual reality experiments showed a gender split in the results. For men, the eyes mainly helped in dangerous situations, warning them to pause when they might otherwise have proceeded. For women, the eyes boosted confidence by signaling when it was safe to cross.
via University of Tokyo: Can Eyes on a Car Reduce Traffic Accidents? Chia-Ming Chang et al. Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. September 2022 Pages 349–359. https://doi.org/10.1145/3543174.3546841