This Is How You're Being Manipulated

At a preliminary Senate hearing today on the subject of potentially putting legislative limits on the persuasiveness of technology—a diplomatic way of saying the addiction model the internet uses to keep people engaged and clicking—Tristan Harris, the executive director of the Center for Humane Technology, told lawmakers that while rules are important, what needs to come first is public awareness. Not an easy task. Algorithms and machine learning are terrifying, confusing, and somehow also boring to think about. However, “one thing I have learned is that if you tell people ‘this is bad for you,’ they won’t listen,” Harris said. “If you tell people ‘this is how you’re being manipulated,’ no one wants to feel manipulated.”

In other words, the polarization of our society is actually part of the business model. […] As recently as just a month ago on YouTube, if you did a map of the top 15 most frequently mentioned verbs or keywords on the recommended videos, they were: hates, debunks, obliterates, destroys […] that kind of thing is the background radiation that we’re dosing two billion people with.
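
Harris is describing, in effect, a word-frequency count over the titles of recommended videos. A toy Python sketch of that kind of tally, using invented titles and an assumed keyword list, might look like this:

```python
# Hypothetical sketch: tally the most frequent "combat" keywords across
# recommended-video titles. Titles and keyword list are invented.
from collections import Counter
import re

KEYWORDS = {"hates", "debunks", "obliterates", "destroys"}

recommended_titles = [
    "Pundit OBLITERATES rival in heated debate",
    "Scientist debunks viral health claim",
    "Host destroys opposing argument in 60 seconds",
    "Columnist hates the new policy, and says so",
]

counts = Counter(
    word
    for title in recommended_titles
    for word in re.findall(r"[a-z]+", title.lower())
    if word in KEYWORDS
)

print(counts.most_common(15))  # e.g. [('obliterates', 1), ('debunks', 1), ...]
```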

They determine where our children go to school, whether someone will receive Medicaid benefits, who is sent to jail before trial, which news articles we see, and which job-seekers are offered an interview […] they are primarily developed and deployed by a few powerful companies and therefore shaped by these companies’ values, incentives, and interests. […] While most technology companies promise that their products will lead to broad societal benefits, there’s little evidence to support these claims, and in fact mounting evidence points to the contrary.

While Harris, Richardson, and Wolfram may be able to rattle off dozens of examples of massive abuse of user trust or cite studies showing the negative impacts these technologies have had on the lives of regular, unsuspecting Americans, it speaks volumes that a sitting Senator would prefer death to the future we’re currently building.

Imagine a world in which priests only make their money by selling access to the confession booth to someone else, except in this case Facebook listens to two billion people’s confessions, has a supercomputer next to them, and is calculating and predicting confessions you’re going to make before you know you’re going to make them.

On my right side there’s crazy town: UFOs, conspiracy theories, Bigfoot, whatever. […] if I’m YouTube and I want you to watch more, which direction am I going to send you?

And that avatar, based on all the clicks and likes and everything you ever made—those are like your hair clippings and toenail clippings and nail filings that make the avatar look and act more and more like you—so that inside of a Google server they can simulate more and more possibilities about ‘if I prick you with this video, if I prick you with this video, how long would you stay?’ And the business model is simply what maximized watch time.
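
Stripped of the metaphor, the loop Harris describes is a model that predicts watch time from a behavioral profile and then recommends whichever candidate scores highest. A minimal, purely hypothetical sketch of that loop, with invented names and numbers, could look like this:

```python
# Hypothetical sketch of a watch-time-maximizing recommender. The "avatar"
# is a profile inferred from past behavior; the system picks whichever
# candidate video it predicts will keep the user watching longest.
# All names and numbers are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Video:
    title: str
    topic: str
    length_minutes: float

@dataclass
class UserAvatar:
    # Topic affinities inferred from clicks, likes, and watch history.
    topic_affinity: dict = field(default_factory=dict)

    def predicted_watch_minutes(self, video: Video) -> float:
        # Crude stand-in for a learned model: affinity times video length.
        return self.topic_affinity.get(video.topic, 0.1) * video.length_minutes

def recommend(avatar: UserAvatar, candidates: list) -> Video:
    # "If I prick you with this video, how long will you stay?"
    return max(candidates, key=avatar.predicted_watch_minutes)

avatar = UserAvatar(topic_affinity={"outrage": 0.9, "news": 0.3})
candidates = [
    Video("Calm nightly news recap", "news", 10),
    Video("Commentator DESTROYS opponent", "outrage", 14),
]
print(recommend(avatar, candidates).title)  # picks the higher predicted watch time
```

Nothing in that objective cares what the video claims; only the predicted minutes of watching matter, which is the point Harris is making.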

The problem with a lot of these systems is they’re based on data sets which reflect all of our current conditions, which also means any imbalances in our conditions […] Amazon’s hiring algorithm was found to have gender-disparate outcomes, and that’s because it was learning from prior hiring practices.
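
The dynamic Richardson describes is easy to reproduce in miniature: fit even a trivial "model" to historically skewed outcomes and it learns the skew. The data below is invented purely for illustration and is not the actual Amazon system:

```python
# Toy illustration (not the actual Amazon system): a naive model fit to
# historically skewed hiring outcomes simply learns and reproduces the skew.
import random

random.seed(0)

# Invented history: equally skilled applicants, but group "B" was hired
# far less often than group "A" at the same skill level.
history = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    skill = random.random()
    hired = skill > 0.5 and (group == "A" or random.random() < 0.3)
    history.append((group, hired))

# "Training" the simplest possible model: per-group hire rates from the past.
learned_rate = {
    g: sum(1 for grp, h in history if grp == g and h)
       / sum(1 for grp, _ in history if grp == g)
    for g in ("A", "B")
}
print({g: round(r, 2) for g, r in learned_rate.items()})
# Roughly {'A': 0.5, 'B': 0.15}: the disparity in the data becomes the model.
```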

That doesn’t seem so [much like a dark pattern], and that’s what’s so insidious about it—you’re giving people a way to follow each other’s behavior—but what it actually is doing is an attempt to cause you to come back every day because you want to see ‘do I have more followers than I did yesterday?’
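
The mechanic Harris is pointing at is simple enough to sketch: compute the day-over-day change in follower count and turn it into a reason to open the app again. The function and message copy below are hypothetical:

```python
# Hypothetical sketch of the follower-count nudge: surface the day-over-day
# delta so there is always a reason to check back tomorrow. Names and
# message copy are invented.
from datetime import date

def daily_follower_nudge(counts_by_day: dict, today: date, yesterday: date) -> str:
    delta = counts_by_day.get(today, 0) - counts_by_day.get(yesterday, 0)
    if delta > 0:
        return f"You gained {delta} followers since yesterday. See who they are!"
    # Even a flat or falling count gets framed as a reason to open the app.
    return "See how your profile is doing today."

counts = {date(2019, 6, 24): 120, date(2019, 6, 25): 127}
print(daily_follower_nudge(counts, date(2019, 6, 25), date(2019, 6, 24)))
```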

If you say ‘I want to delete my Facebook account,’ it puts up a screen that says ‘are you sure you want to delete your Facebook account, the following friends will miss you’ and it puts up faces of certain friends.
