[kwlug-disc] More reasons to end Facebook
jason.eckert
jason.eckert at gmail.com
Sat Sep 25 12:02:14 EDT 2021
I gave up Facebook for good 3 years ago and never looked back.

That is a great writeup, but what is even more important to note is that Facebook fosters an unhealthy relationship with technology in general, and amplifies/encourages negative traits and behaviour (control freakism, depression, poor self-image, alienation, dependency, etc.).

For those reasons alone, I don't think anyone should use it.

Sent from my Samsung device running Android (basically Linux in drag)
-------- Original message --------
From: CrankyOldBugger <crankyoldbugger at gmail.com>
Date: 2021-09-25 10:04 (GMT-05:00)
To: KWLUG discussion <kwlug-disc at kwlug.org>
Subject: [kwlug-disc] More reasons to end Facebook

While some of us in this group, including myself, swore off Facebook years ago, we all know people who swim in it every day. In this week's email from The Markup (which I highly recommend), they talked about more issues with FB that I thought you guys would like (like as in enjoy, not like as in "remember to Like and Subscribe!"...). The Markup can be found at https://themarkup.org/

Honestly, it was kind of hard to imagine that Facebook's image could get more tarnished. After all, the company has been mired in negative press for the past five years, ever since the Cambridge Analytica scandal and the Russian disinformation campaign fueled Trump's election in 2016.

But, amazingly, public relations has gotten even worse for Facebook this month. The precipitating event was the emergence of a Snowden-style trove of documents, "The Facebook Files," that appear to have been leaked to The Wall Street Journal reporter Jeff Horwitz. In a five-part series, The Wall Street Journal used those documents to reveal that not only was Facebook fueling teenage self-harm and enabling human trafficking, but that Facebook itself also knew that its platform contributed to those problems and chose to ignore it. The series revealed that:

- Facebook's secret program, XCheck, allows millions of VIPs to be exempt from the company's normal content moderation rules preventing hate speech, harassment, and incitements to violence. A 2019 internal review of the program declared that its double standard was a "breach of trust."
- Facebook's own research found that Instagram makes one-third of teenage girls feel worse about their bodies.
- Facebook tweaked its algorithms to "increase meaningful social interactions" but found that the shift actually boosted "misinformation, toxicity, and violent content."
- Facebook has been lethargic about removing dangerous content in developing countries, including human trafficking and incitements to ethnic violence, even when the content was flagged internally.
- Facebook was slow to address a content moderation loophole that allowed anti-vaccine advocates to flood Facebook with anti-vax comments, despite internal cries for a fix.

In many of these arenas, it's been documented in the past that Facebook was enabling harm. What's new is proof that Facebook has long understood its ills but has repeatedly failed to address them adequately. As the Journal put it, "Facebook knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands."

Facebook responded that the Journal was "cherry-picking" anecdotes that mischaracterized its actions. "The fact that not every idea that a researcher raises is acted upon doesn't mean Facebook teams are not continually considering a range of different improvements," Nick Clegg, Facebook's vice president of global affairs, wrote in a blog post.

At the same time, new evidence of harm enabled by Facebook continued to accumulate in other publications, including The Markup. The New York Times reported that Facebook had rolled out a feature intended to promote positive news about the social network in users' feeds, in part to bury negative press about itself. And ProPublica reported that Facebook Marketplace was riddled with fraudulent and scam listings.

And, this week, The Markup Citizen Browser project manager Angie Waller and reporter Colin Lecher revealed that Facebook has been disproportionately amplifying Germany's far-right political party during the run-up to tomorrow's parliamentary elections. Although the political party Alternative for Germany (Alternative für Deutschland), or the AfD, is relatively small (it won just 12 percent of the vote in the 2017 parliamentary elections), its supporters' anti-immigrant and anti-pandemic-restrictions posts were displayed more than three times as often as posts from much larger rival political parties and their supporters in our German panelists' news feeds.