Saturday, May 19, 2018

Reality Modification


For those interested in computational social thermodynamics, the discussion is getting interesting. There's a lot happening on the TED stage this year.

A guru of sorts for all things internet, Jaron Lanier has some opinions about the social implications of a highly automated social ecosystem:

"In the beginning it was cute but as computers became more efficient and algorithms got better, it can no longer be called advertising any more - it has turned into behaviour modification." -BBC

Isn't all advertising a form of behavior modification? It tries to get you to buy something. In this case, we shouldn't confuse the two kinds of advertising going on when we use a service like Facebook or Google.

We are being shown things we can pay money to get, be it food delivery or sneakers. These third parties are advertising their products to us via a web service. As much as this might modify our behavior, we don't see it as behavior modification until we go to buy the thing advertised. Then it's behavior modification. Before then, I'm not sure what to call it. Mind control.

There is another kind of advertising, however, and it is the less obvious one. This is the advertising of the service via the service itself. It's the form of advertising we don't even notice, and that's what gives it the potential to do bad things to society.

Any "free" web service must have built into it a system by which the users' behaviors are modified to increase the chance of their using the service again. If the purpose of a service is ostensibly to help people connect with each other, then guess what - that system is going to work to put you in the way of people most like you, because that's who you're more likely to connect with.

At the large scale, this defeats the underlying nature of social networks that keeps a superentity like Facebook alive (instead of one big web, we get a bunch of separate little webs). But on a small enough scale, it keeps you hooked up and tapped into the people you are most likely to interact with. It shows you pictures like the ones you've already liked, because you're more likely to like those. It tells you about people who think and talk like you, because you're more likely to like what they say if it's similar to what you already say, hence you use the program more. (Nowadays this is called an Echo Chamber or a Filter Bubble, and it's become a pretty common idea.)
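Just to make the mechanism concrete, here's a toy Python sketch of the kind of ranking loop I'm describing. Every name and number in it is made up, and real feed-ranking systems are vastly more complicated, but the gist is the same: score each candidate post by how similar it is to what you've already liked, and show the closest matches first.

import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def rank_feed(liked_items, candidates):
    """Order candidate posts by similarity to the user's liking history."""
    # Average the user's liked-item features into a single "taste" vector.
    taste = [sum(col) / len(liked_items) for col in zip(*liked_items)]
    return sorted(candidates, key=lambda c: cosine(taste, c["features"]), reverse=True)

# Toy feature vectors, e.g. [politics, sports, cats]
liked = [[0.9, 0.1, 0.0], [0.8, 0.0, 0.2]]
posts = [
    {"id": "rally_photo",  "features": [0.95, 0.0, 0.05]},
    {"id": "match_report", "features": [0.05, 0.9, 0.05]},
    {"id": "kitten_video", "features": [0.1, 0.0, 0.9]},
]

for post in rank_feed(liked, posts):
    print(post["id"])  # the politics post ranks first: more of what you already like

Run that a few times with your own clicks feeding back into "liked" and you can see how the loop tightens on itself.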

The problem for us people is that this is not how society works. You can't only socialize with people like you. This is where the echo chamber or filter bubble comes from. The network gets chiseled finer and finer until it's basically a mirror of your self (although we should note, this is your outward self, the one that lives out there in society, not the inner self...but which self is the real one, I'm not here to decide). This is where, across much greater scales, the polarization of a society comes from. Less grey area, less room for debate, more need for absolutes. And that means less reality, because reality is anything but absolute.
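The chiseling itself is easy to caricature in a few lines of Python. This is a purely hypothetical toy, not a model of any real platform: each round, the feed keeps only the half of your connections whose opinions sit closest to yours, and the spread of opinions you see collapses toward your own.

import random

random.seed(1)
my_opinion = 0.5
neighbors = [random.uniform(0.0, 1.0) for _ in range(1000)]  # opinions on a 0-1 scale

for step in range(5):
    # The filtering step: keep the half of the network most similar to me.
    neighbors.sort(key=lambda opinion: abs(opinion - my_opinion))
    neighbors = neighbors[: len(neighbors) // 2]
    spread = max(neighbors) - min(neighbors)
    print(f"round {step + 1}: {len(neighbors)} connections, opinion spread {spread:.2f}")

After five rounds you're left with a small crowd that already agrees with you, which is the mirror I'm talking about.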

Back to the point Jaron Lanier was trying to make: maybe reality modification is what we should be discussing here, not behavior modification.


Post Script:
Far out, I'm thinking we will eventually refuse to live in each other's worlds. We will be forced to run simulations of our own preferred realities and make them compete with each other for the primary shared reality. Kind of like the idea that 'all news is fake news unless it's your news'. Ultimately, we're already doing this. Watch it unfold.

Notes:
Facebook and Google need ad-free options, says Jaron Lanier (BBC)
