Sunday, October 8, 2017
Physiodata at Large
Drone detects heartbeat and breathing rates
Sep 2017, BBC
The system detects movements in human faces and necks in order to accurately estimate heart and breathing rates.
***
In other words, facial recognition algorithms have now gone totally apeshit.
I guess they're just looking at your neck and reading your pulse that way. Do our faces (our heads, really) move in the rhythm of our breathing, so slightly that we might not see it, but a robotic eye-brain can?
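If I had to guess at the mechanics, it's probably something like remote photoplethysmography: blood flow makes the skin's brightness flicker almost imperceptibly with every heartbeat, and a camera plus some signal processing can pull that rhythm back out. Here's a toy sketch of the general idea in Python (the `frames`, `fps`, and `roi` inputs are hypothetical stand-ins; whatever the drone actually runs is surely fancier):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, fps, roi):
    """Pulse estimate (bpm) from subtle brightness changes in a skin patch.

    frames: grayscale video, shape (n_frames, height, width)
    fps:    frames per second of the video
    roi:    (top, bottom, left, right) bounds of a skin region, e.g. a forehead
    """
    top, bottom, left, right = roi

    # One number per frame: the average brightness of the patch.
    # Blood flow modulates this very slightly with each heartbeat.
    signal = frames[:, top:bottom, left:right].mean(axis=(1, 2))
    signal = signal - signal.mean()  # remove the DC offset

    # Band-pass to plausible human heart rates: 0.7-4 Hz (42-240 bpm).
    nyquist = fps / 2.0
    b, a = butter(3, [0.7 / nyquist, 4.0 / nyquist], btype="band")
    filtered = filtfilt(b, a, signal)

    # The dominant frequency within that band is the pulse estimate.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # Hz -> beats per minute
```

Breathing would presumably fall out the same way, just from a lower frequency band (roughly 0.1 to 0.5 Hz, i.e. 6 to 30 breaths per minute).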
Now that we can get live physiological data from large groups of people, simultaneously and in realtime, just by looking at them, this is no time to forget that we can already read the date on a dime on the sidewalk from a satellite in orbit.
In extrapolation, all I can think about is Kim Stanley Robinson's Aurora (2015), where the multi-generational starship, run by a quantum-computing AI instead of a captain, finally "decides" after a civil war on board that in some cases it's better to let the air out of a biome than to let the people in it do harm to the ship, because, you know, the greater good. The people don't die, at least most of them don't; instead they just get really, really tired and docile.
Narrative snippets have the ship reporting the "average pulse rate of the ship," meaning the pulse averaged across every inhabitant aboard, data that an AI-equipped starship of the 22nd century can very capably collect.
Who's about to riot? Those people with the quickening pulses, that's who. Face recognition used to yield data about the outside, like your face. Now it can get data from the inside. Maybe "angry faces" are easy to identify, and might be more predictive than pulse. Maybe they're the same thing. But something about a drone I can't even see knowing what's going on inside my body makes me think we're already living in these science fiction novels.
image: Woody Allen on the couch in his 1977 film Annie Hall, BBC