https://lite.cnn.com/en/article/h_d22fe985f2a974bf129aa1e5b4459476
------------------------------
Date: Thu, 12 Nov 2020 10:20:49 -0800
From: Rob Slade <rslade@gmail.com>
Subject: Working Group on Infodemics Policy Framework, Nov. 2020
Reported by some as a set of guidelines for regulating social media
(https://www.bbc.com/news/technology-54901083), the policy framework
released by the Working Group on Infodemics
(https://informationdemocracy.org/working-groups/concrete-solutions-against-the-infodemic/)
is something many of us should be examining, and possibly critiquing. The
policy framework itself can be found at:
https://informationdemocracy.org/wp-content/uploads/2020/11/ForumID_Report-on-infodemics_101120.pdf
The working group is supported by 38 countries, so this framework will
likely have wide currency and impact. The composition of the working group
is interesting: the majority are *NOT* technical people, but people from
political or media backgrounds. It is good that techies aren't the only
ones involved, but the lack of a strong technical background may show in a
limited ability to implement some of the major recommendations.
The report itself is 128 pages long, but the twelve main recommendations (divided into four categories) are listed on pages 14 and 15. They are:
PUBLIC REGULATION IS NEEDED TO IMPOSE TRANSPARENCY REQUIREMENTS ON ONLINE SERVICE PROVIDERS.
1. Transparency requirements should relate to all platforms' core functions
in the public information ecosystem: content moderation, content ranking, content targeting, and social influence building.
2. Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.
3. Sanctions for non-compliance could include large fines, mandatory
publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country's market.
A NEW MODEL OF META-REGULATION WITH REGARDS TO CONTENT MODERATION IS
REQUIRED.
4. Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law: legality, necessity and proportionality, legitimacy, equality and non-discrimination.
5. Platforms should assume the same kinds of obligation in terms of
pluralism that broadcasters have in the different jurisdictions where they operate. An example would be the voluntary fairness doctrine.
6. Platforms should expand the number of moderators and spend a minimal percentage of their income to improve the quality of content review, particularly in at-risk countries.
NEW APPROACHES TO THE DESIGN OF PLATFORMS HAVE TO BE INITIATED.
7. Safety and quality standards of digital architecture and software engineering should be enforced by a Digital Standards Enforcement Agency.
The Forum on Information and Democracy could launch a feasibility study on
how such an agency would operate.
8. Conflicts of interests of platforms should be prohibited, in order to
avoid the information and communication space being governed or influenced
by commercial, political or any other interests.
9. A co-regulatory framework for the promotion of public interest
journalistic contents should be defined, based on self-regulatory standards such as the Journalism Trust Initiative; friction to slow down the spread
of potentially harmful viral content should be added.
SAFEGUARDS SHOULD BE ESTABLISHED IN CLOSED MESSAGING SERVICES WHEN THEY
ENTER INTO A PUBLIC SPACE LOGIC.
10. Measures that limit the virality of misleading content should be implemented through limitations of some functionalities: opt-in features to receive group messages, and measures to combat bulk messaging and automated behavior.
11. Online service providers should be required to better inform users regarding the origin of the messages they receive, especially by labeling
those which have been forwarded.
12. Notification mechanisms for illegal content reported by users, and appeal
mechanisms for users who were banned from services, should be reinforced.
------------------------------
Date: Mon, 1 Aug 2020 11:11:11 -0800
From: RISKS-request@csl.sri.com
Subject: Abridged info on RISKS (comp.risks)
The ACM RISKS Forum is a MODERATED digest. Its Usenet manifestation is
comp.risks, the feed for which is donated by panix.com as of June 2011.
SUBSCRIPTIONS: The mailman Web interface can be used directly to
subscribe and unsubscribe:
http://mls.csl.sri.com/mailman/listinfo/risks
SUBMISSIONS: to risks@CSL.sri.com with meaningful SUBJECT: line that
includes the string `notsp'. Otherwise your message may not be read.
*** This attention-string has never changed, but might if spammers use it.
SPAM challenge-responses will not be honored. Instead, use an alternative
address from which you never send mail where the address becomes public!
The complete INFO file (submissions, default disclaimers, archive sites,
copyright policy, etc.) is online.
<http://www.CSL.sri.com/risksinfo.html>
*** Contributors are assumed to have read the full info file for guidelines!
OFFICIAL ARCHIVES: http://www.risks.org takes you to Lindsay Marshall's
searchable html archive at newcastle:
http://catless.ncl.ac.uk/Risks/VL.IS --> VoLume, ISsue.
Also, ftp://ftp.sri.com/risks for the current volume/previous directories
or ftp://ftp.sri.com/VL/risks-VL.IS for previous VoLume
If none of those work for you, the most recent issue is always at
http://www.csl.sri.com/users/risko/risks.txt, and index at /risks-32.00
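The VL.IS volume/issue naming scheme above can be sketched as a small URL
builder. This is a hypothetical helper for illustration only: the function
names are invented, the path formats are taken from the lines above, and the
two-digit zero-padding of the issue number is an assumption based on the
`/risks-32.00` index example.

```python
# Hypothetical helpers illustrating the RISKS archive URL patterns.
# Path formats come from the digest text; zero-padding is an assumption.

def risks_issue_url(volume: int, issue: int) -> str:
    """Build the Newcastle (catless) archive URL for a volume and issue."""
    # The archive encodes issues as <VoLume>.<ISsue>.
    return f"http://catless.ncl.ac.uk/Risks/{volume}.{issue:02d}"

def risks_ftp_url(volume: int, issue: int) -> str:
    """Build the SRI FTP URL for an issue in a previous volume."""
    return f"ftp://ftp.sri.com/{volume}/risks-{volume}.{issue:02d}"

print(risks_issue_url(32, 37))  # -> http://catless.ncl.ac.uk/Risks/32.37
print(risks_ftp_url(32, 5))    # -> ftp://ftp.sri.com/32/risks-32.05
```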
ALTERNATIVE ARCHIVES:
http://seclists.org/risks/ (only since mid-2001)
*** NOTE: If a cited URL fails, we do not try to update them. Try
browsing on the keywords in the subject line or cited article leads.
Apologies for what Office365 and SafeLinks may have done to URLs.
Special Offer to Join ACM for readers of the ACM RISKS Forum:
<http://www.acm.org/joinacm1>
------------------------------
End of RISKS-FORUM Digest 32.37
************************