The Patriot Files Forums

The Patriot Files Forums (http://www.patriotfiles.com/forum/index.php)
-   Terrorism (http://www.patriotfiles.com/forum/forumdisplay.php?f=110)
-   -   CEP Statement On Section 230 Reform And The Justice Against Malicious Algorithms Act (http://www.patriotfiles.com/forum/showthread.php?t=1520337)

Boats 10-14-2021 12:08 PM

CEP Statement On Section 230 Reform And The Justice Against Malicious Algorithms Act
 
CEP Statement On Section 230 Reform And The Justice Against Malicious Algorithms Act of 2021
By: Hany Farid
Topic: “Algorithmic Amplification Is A Key Driving Force For Spreading Problematic, Harmful Content Online”

(New York, N.Y.) — Dr. Hany Farid, Counter Extremism Project (CEP) senior advisor and professor at the University of California, Berkeley, released the following statement in support of the Justice Against Malicious Algorithms Act of 2021. This bill—which will be introduced tomorrow by Energy and Commerce Committee Chairman Frank Pallone, Jr. (D-NJ), Communications and Technology Subcommittee Chairman Mike Doyle (D-PA), Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL), and Health Subcommittee Chair Anna Eshoo (D-CA)—would narrowly amend Section 230(c) of the Communications Decency Act (CDA) and lift the liability shield when an online platform knowingly or recklessly deploys recommendation algorithms to promote content that materially contributes to physical or severe emotional injury.

“Algorithmic amplification is a key driving force for spreading problematic, harmful content online. As it stands, for-profit tech companies have a business incentive to increase engagement on their platforms—including by promoting divisive, hateful, and extremist content—in order to increase revenues. Legislation is clearly needed to shift that corporate calculation.

“The proposed bill is a sensible legislative solution that holds the tech industry accountable for their reckless behavior in proliferating content ranging from child sex abuse, terrorism, the sale of illegal narcotics and weapons, to misinformation.”

In March 2020, Dr. Farid and other UC Berkeley researchers authored a study, A Longitudinal Analysis Of YouTube’s Promotion Of Conspiracy Videos, that analyzed YouTube’s policies and efforts to curb its recommendation algorithm’s tendency to spread divisive conspiracy theories. After reviewing eight million recommendations over 15 months, the researchers determined that the progress YouTube claimed (a 50 percent reduction in the time users spent watching recommended conspiracy videos as of June 2019, and 70 percent as of December 2019) did not make the “problem of radicalization on YouTube obsolete nor fictional.” The study ultimately found that a more complete analysis of YouTube’s algorithmic recommendations showed conspiratorial recommendations are “now only 40 percent less common than when YouTube’s measures were first announced.”

On June 30, 2021, CEP hosted a webinar with Dr. Farid to explore the nature and extent of the global misinformation phenomenon, the role of algorithmic amplification in promoting misinformation and divisive content online, and its devastating consequences. The webinar also examined several technological and regulatory interventions that could potentially curb misinformation, including the upcoming EU Digital Services Act.

