

Pete Lawrence - 19 Sep 2017

"Mr Foer compares tech’s lack of transparency to Italy, “where it’s never entirely clear how power really operates"

 

More than ever, there is a strong case for an 'algorithm-free' social network (thanks to @Gregory Thompson for the phrase).

Facebook now reportedly has 60m+ lines of code setting the algorithms that seek to control not only our daily newsfeed but our emotions too. In his new book 'World Without Mind: The Existential Threat of Big Tech', Franklin Foer argues that technology is making our minds redundant, and goes on to suggest that although Facebook would never admit it, "algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behaviour can be altered".

The Economist suggests that Foer's book doesn't really provide answers so much as shine a light on the issues, and that regulators need to look much more closely at these monolithic tech companies, which in many ways are ruling our world and dictating to governments whether or not they fancy paying tax: "Franklin Foer’s passionate contribution to the debate about technology firms raises many questions, but settles few of them."

The questions are urgent and numerous:

  • What are users of social networks actually looking for?
  • Is there an addictive element in the dependence on their feeds?
  • In what ways is the news shaped by what we see within our algorithmic bubble?
  • How does the ability to buy priority on Facebook influence key political milestones such as elections?

In his Guardian long-read article 'Facebook's war on free will', Foer cuts to the chase:

“Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behaviour can be altered, without our even being aware of the hand guiding us, in a superior direction. That’s always been a danger of the engineering mindset, as it moves beyond its roots in building inanimate stuff and begins to design a more perfect social world. We are the screws and rivets in the grand design.

"Facebook likes to boast about the fact of its experimentation more than the details of the actual experiments themselves. But there are examples that have escaped the confines of its laboratories. We know, for example, that Facebook sought to discover whether emotions are contagious. To conduct this trial, Facebook attempted to manipulate the mental state of its users. For one group, Facebook excised the positive words from the posts in the news feed; for another group, it removed the negative words. Each group, it concluded, wrote posts that echoed the mood of the posts it had reworded. This study was roundly condemned as invasive, but it is not so unusual. As one member of Facebook’s data science team confessed: “Anyone on that team could run a test. They’re always trying to alter people’s behaviour.”

The Economist sets the context for Foer: "A rising figure in the cohort of tech-company critics is Franklin Foer, a journalist at the Atlantic. His new book “World Without Mind” decries society’s capture by big technology companies, mainly Amazon, Facebook and Google. His criticisms are wide-ranging, but centre on the idea that they have become monopolies. Their dominance has gutted the financial health of publishers and music companies. He even charges tech firms with having bruised democracy: they serve up information based on opaque algorithms, suggesting what people should think, and so supplanting individual thought. Mr Foer compares tech’s lack of transparency to Italy, “where it’s never entirely clear how power really operates”.

"But Mr Foer does not want to seem “fuelled by anger”, and he makes a few important points. One is that tech firms exert so much power that people demur from criticising them. Mr Foer saw this first-hand when he became an activist against Amazon’s treatment of authors and publishers. Because the online giant could influence the success of books, many lawyers and publishing executives feared speaking out.

"Mr Foer’s concern about opacity is also spot-on. For example, Facebook and Google are not bound by requirements to report sales of political advertising as traditional media firms are. Recent revelations of Russian ad-buying on Facebook during America’s presidential election underscore the risk of so little oversight.

The questions go deeper. In his paper 'A Tyranny of Algorithms', Campfire member @Richard Dent raises more questions about how much sway Facebook has over our political choices:

"What effects are algorithms having on society? Who is accountable if things go wrong? This is a new social relationship between corporations and citizens. What are the boundaries of this relationship? If Facebook can influence the emotions of its users, it could feasibly use its algorithm to influence people towards specific political perspectives or worldviews. Scary stuff. Can Facebook users vote with their feet should these practices go too far?"

 

World Without Mind: The Existential Threat of Big Tech by Franklin Foer is published in the UK at the end of September.

 

1 Comment


Ralph Pettingill

Thanks @Pete Lawrence, I've found the interview podcast really thought-provoking ...
