An interview in which a former Facebook employee explains why she copied and leaked the company's own internal research.
(Scroll down for the English script ↓↓↓)
Link to today's article:
Vocabulary links:
https://ejje.weblio.jp/content/testify
https://ejje.weblio.jp/content/Senate
https://ejje.weblio.jp/content/whistleblower
https://ejje.weblio.jp/content/Hate+speech
https://ejje.weblio.jp/content/misinformation
https://ejje.weblio.jp/content/law+enforcement
https://ja.wikipedia.org/wiki/%E5%88%A9%E7%9B%8A%E7%9B%B8%E5%8F%8D
https://ejje.weblio.jp/content/the+public
https://www.ecosia.org/search?q=divisive%20political%20speech%20%E6%97%A5%E6%9C%AC%E8%AA%9E
https://eow.alc.co.jp/search?q=tear%20apart
https://ejje.weblio.jp/content/eating+disorder
Related links:
https://www.theguardian.com/technology/2021/oct/04/facebook-whistleblower-testify-us-senate
Me:
Site: http://www.tensaimon.com
SNS: tensaimon (Instagram: https://www.instagram.com/kusaimon/)
Credits:
Music by Kajiki
Sounds: https://freesound.org/people/tensaimon/bookmarks/
English script (quotes and summary from the article):
Her name is Frances Haugen.
An anonymous former employee filed complaints with federal law enforcement.
The complaints say Facebook’s own research shows that it amplifies hate, misinformation and political unrest—but the company hides what it knows.
One complaint alleges that Facebook’s Instagram harms teenage girls.
A trove of private Facebook research she took when she quit in May.
Frances Haugen is revealing her identity to explain why she became the Facebook whistleblower.
There were conflicts of interest between what was good for the public and what was good for Facebook.
And Facebook, over and over again, chose to optimize for its own interests, like making more money.
She secretly copied tens of thousands of pages of Facebook internal research.
The company is lying to the public about making significant progress against hate, violence and misinformation.
“We have evidence that hate speech, divisive political speech and misinformation are affecting societies around the world.”
The version of Facebook that exists today is tearing our societies apart.
you have your phone. You might see only 100 pieces of content if you sit and scroll on for, you know, five minutes. But Facebook has thousands of options it could show you.
The algorithm picks from those options based on the kind of content you’ve engaged with the most in the past.
optimizing for content that gets engagement, or reaction
It’s easier to inspire people to anger than to other emotions.
Misinformation and angry content are enticing to people and keep them on the platform.
If they change the algorithm to be safer, people will spend less time on the site, they’ll click on fewer ads, and they’ll make less money.
Facebook essentially amplifies the worst of human nature.
Forcing us to take positions that we know are bad for society.
If we don’t take those positions, we won’t win in the marketplace of social media.
Facebook’s own research says they get more and more depressed. And it actually makes them use the app more.