Monday, September 28, 2020

the ethics of it

 More than incorporating religious studies in public education, it is vital to inculcate ethics and instil an innate sense of right and wrong in young minds. Reading Mindf*ck by Christopher Wylie, one sees how the elite, privileged with inherited wealth, education and status, become neo-colonialists, using information warfare to control elections in African and other developing nations. Handsomely paid for their data mining, their access to big data lets them manipulate voters and sway the fence-sitters. Where capitalism once thrived on colonial access to cheap natural resources or slave labour, surveillance capitalism now manipulates consumer behaviour; by analysing the neurological responses of TV viewers and web surfers, these firms can profile a user's personality better than the user's own parents or partner. Where in the past they sought cheap natural resources or labour, they now invade privacy and harvest data for marketing purposes or political manipulation. If unethical behaviour reigns, the result is that corruption, obscene profiteering and vote buying become entrenched.

The other interesting insight I read is the Ellsberg paradox, in which economic theory on decision making proposes that, caught between the devil and the deep blue sea, people tend to choose the devil they know over the ambiguous, uncertain option, even if the known option gives them a lower chance of winning. Fear of the unknown is a common psychological response in decision making.
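The paradox can be made concrete with a small sketch (my own illustration, using the standard one-urn version of Ellsberg's experiment: 30 red balls, and 60 black or yellow balls in an unknown mix). Most people prefer betting on red (a known 1/3 chance) over black (ambiguous), yet also prefer betting on black-or-yellow (a known 2/3 chance) over red-or-yellow (ambiguous). The code below checks that no single subjective probability for "black" can make both choices rational at once, which is why this pattern of ambiguity aversion is called a paradox:

```python
def preferred(p_black):
    """Given a subjective probability P(black), return which bets a pure
    expected-value maximiser would pick (payoff 1 if the bet wins).
    Bets: A = red, B = black, C = red-or-yellow, D = black-or-yellow."""
    p_red = 1 / 3
    p_yellow = 2 / 3 - p_black
    first = "A" if p_red > p_black else "B"
    second = "C" if (p_red + p_yellow) > (p_black + p_yellow) else "D"
    return first, second

# Scan subjective probabilities from 0 to 2/3: an expected-value
# maximiser always picks (A, C) or (B, D), never the commonly
# observed ambiguity-averse pair (A, D).
choices = {preferred(p / 100) for p in range(0, 67)}
print(choices)
print(("A", "D") in choices)  # the observed human preference is unreachable
```

The yellow term cancels in the second comparison, so both choices hinge on the same quantity, P(red) versus P(black); that is exactly why the human preference for (A, D) cannot come from any single probability assignment.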

We need to rethink social media before it's too late. We've accepted a Faustian bargain

Jeff Orlowski

A business model that alters the way we think, act, and live our lives has us heading toward dystopia

‘We scroll insatiably, unsuspecting that the technology that connects us, especially now in a distanced world, is also controlling us.’ Photograph: Sam Thomas/Getty Images

When people envision technology overtaking society, many think of The Terminator and bulletproof robots. Or Big Brother in George Orwell’s Nineteen Eighty-Four, a symbol of external, omnipotent oppression.

But in all likelihood, dystopian technology will not strong-arm us. Instead, we’ll unwittingly submit ourselves to a devil’s bargain: freely trade our subconscious preferences for memes, our social cohesion for instant connection, and the truth for what we want to hear.

Indeed, as former insiders at Google, Twitter, Facebook, Instagram and YouTube attest in our new documentary, The Social Dilemma, this is already happening. We already live in a version of Aldous Huxley’s Brave New World. As Neil Postman puts it in his 1985 book Amusing Ourselves to Death: Public Discourse in the Age of Show Business:

In Huxley’s vision, no Big Brother is required to deprive people of their autonomy, maturity, and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think.

The technology that threatens our society, democracy, and mental health is lurking in our bedrooms, sometimes lying on our pillows, as we fall asleep. We awake to its call, bring its chiming notifications to dinner, and blindly trust where it guides us. We scroll insatiably, unsuspecting that the technology that connects us, especially now in a distanced world, is also controlling us.

Our social media platforms are powered by a surveillance-based business model designed to mine, manipulate, and extract our human experiences at any cost, causing a breakdown of our information ecosystem and shared sense of truth worldwide. This extractive business model is not built for us but built to exploit us.

A third of American adults, and nearly half of those aged 18-29, say they are online “almost constantly”. But, unlike the citizens of Brave New World, we’re miserable. As our time online has gone up, so have anxiety, depression and suicide rates, particularly among youth.

Social media is also derailing productive public discourse. A largely ignored internal memo to senior executives at Facebook in 2018 explained: “Our algorithms exploit the human brain’s attraction to divisiveness.” Left unchecked, the algorithms will feed users “more and more divisive content in an effort to gain user attention and increase time on the platform”.

In 2014, Pew Research Center found that partisan antipathy and division in America is “deeper and more extensive than at any point in the last two decades”. Over the past six years, social media has only exacerbated these sentiments. In 2019, 77% of Republicans and 72% of Democrats said voters in both parties “not only disagree over plans and policies, but also cannot agree on the basic facts”.

‘Facebook’s recent measures do not address the fundamental problem of their exploitative business model.’ Photograph: Marcio Jose Sanchez/AP

In The Social Dilemma, Tristan Harris, a former Google design ethicist and the co-founder of the Center for Humane Technology, points out that far before technology overpowers human strengths, it will overwhelm human weaknesses. Sophisticated algorithms learn our emotional vulnerabilities and exploit them for profit in insidious ways.

By surveilling nearly all of our online activity, social media platforms can now predict our emotions and behaviors. They leverage these insights and auction us to the highest advertising bidder, and they have consequently become some of the richest companies in the history of the world.

But users aren’t just being sold a pair of shoes. The targeting capabilities of these platforms give anyone with a motive the power and precision to influence us cheaply and with phenomenal ease. Disinformation campaigns have been cited in more than 70 countries, and doubled from 2017 to 2019.

The whistleblower Sophie Zhang has revealed how pervasive the problem is on Facebook’s platform, and how little the company acts on it. Facebook recently rolled out a series of updates to mitigate political misinformation in the upcoming US presidential election, including a bar on political ads one week before election day, but these measures are too little, too late, and they do not address the fundamental problem of their exploitative business model.

After nearly three years of working on this film, I now see “the social dilemma” as a foundational problem of our time, underlying many of the other societal conflicts that require compromise and a shared understanding to fix. If two sides are constantly fed reflections of their pre-existing ideologies and outrageous straw men of opposing views, we will never be able to build bridges and heal the challenges that plague humanity.

But there is hope. In The Terminator sequels, Arnold Schwarzenegger comes back as a good guy. “Who sent you?” John Connor asks. The Terminator answers, “You did. Thirty-five years from now, you reprogrammed me to be your protector.”

In the absence of time travel, the solution needs to incorporate the work and voices of devoted activists, organizations, scholars, and those who have experienced the harms of exploitative technology, which amplifies systemic oppression and inequality. We can’t rely on the people who created the problem to be the ones to solve it. And I won’t trust these social media companies until they change their business model to serve us, the public. Humans created this technology, and we can – and have a responsibility to – change it.


