
Rant: It Was YOU That Ticked Facebook's Box

I should begin this rant with the disclosure that I do not have a Facebook account and have never had one. Conversations in recent years with various friends and acquaintances have therefore gone something like this:

"Are you on Facebook?" (rhetorical question, asking for my account name)

"No."

"Oh. Why?" (look of bewilderment)

"Because I've read the user agreement."

"What?" (look of incredulity and mild disgust)

"Never mind. Here's my email address instead."

"OK. Bye."

These days nobody asks, of course, since Facebook has become the prime addiction of the Western world. You're either on it or you don't exist. I skulk through the shadows, subsisting on the meagre crumbs of social interaction that fall from Facebook's table, eschewing a glowing screen for face-to-face contact whenever I can, having to actually talk to people to find out about their lives and what they're thinking. It's tough, I can tell you.

So it would be easy to make this an "I told you so" rant, relating how smug I feel about the predictable twists and turns the vast social networking experiment is taking, but that would be shallow and unhelpful. And I don't feel smug.

The simple fact of the matter is I've lost out on better contact with friends and acquaintances over the past six years or so because I've refused to tick the box marked "Please take all my personal information and sell it or process it for your own nefarious financial and manipulative ends."

Or something along those lines: I forget the exact wording. So I'm not smug and I haven't won. I've lost. With that out of the way, on with the rant.

In the years before psychology developed a code of ethics and began calling the people who took part in its experiments 'participants' instead of 'subjects', it was an ugly discipline at times. Milgram persuaded volunteers to deliver what they believed were severe electric shocks to strangers, just because someone in a white coat told them to, while Zimbardo's Stanford prison experiment had to be cut short when the ordinary people who volunteered as prison guards started taking their roles a little too seriously.

We learned a lot about ourselves from these experiments, which is why they stick in the mind. Unfortunately, the way in which we learned it was unethical and caused significant trauma and distress to the 'subjects'. So a code of ethics was devised, informed consent was required wherever possible, and a new era began: the era of 'participants'.

In universities, psychology experiments now have to be run past an ethics panel before they're permitted to be carried out on people. The change in approach has been little short of revolutionary. And it worked: we're still learning, albeit in a different direction and perhaps at a slower pace.

Facebook is a company, not a university, and it can do whatever it likes with your data, including changing what you read in order to gauge your emotions and how they are influenced by those around you. Because you ticked that box.

Oh, hang on though, you didn't. It appears that the existing user agreement didn't really cover such action, so Facebook changed its user agreement after the research had already been carried out. I quote from Forbes: "It turns out that Facebook conducted their research four months before adding 'research' to their data use policy."

Oops. Regardless of that minor detail, Facebook's data is ripe for research, enough to make most psychologists drool. The company has recently been involved in research into face recognition, among other interesting topics. It is clearly using your data to learn more about you and make money from you in whatever ways it sees fit.

Well of course it is. It's a business. To do anything else would be dumb. And what are you going to do about it? I suppose you could post a complaint on your wall, or maybe send an enraged tweet. That'll help, I'm sure. Remember, you ticked the box.

Facebook doesn't have users; it has subjects, and those subjects have willingly – but I suspect for the most part unwittingly – agreed to exchange control over their personal information, including the information that they read, for the ability to connect more closely with people (and to post selfies and smugshots designed to make everyone jealous).

Obviously I'm in the minority here but I don't think that was a fair exchange based on informed consent. If Facebook had announced its 'research' intentions clearly and in detail prior to people joining, making clear that it might manipulate their emotions in order to learn more about them, would they still have been so keen to sign up?

Yes, probably. That's human nature for you.

 

Freelance technology journalist Alex Cruickshank grew up in England and emigrated to New Zealand several years ago, where he runs his own writing business called Ministry of Prose.
