Reported by some as a set of guidelines for regulating social media
(https://www.bbc.com/news/technology-54901083), the policy framework released by the Working Group on Infodemics
(https://informationdemocracy.org/working-groups/concrete-solutions-against-the-infodemic/)
is something many of us should be examining, and possibly critiquing. The policy framework itself can be found at:
https://informationdemocracy.org/wp-content/uploads/2020/11/ForumID_Report-on-infodemics_101120.pdf
The working group is supported by 38 countries, so this framework will likely have wide currency and impact. The composition of the working group is also interesting: the majority are *NOT* technical people, but come from political or media backgrounds. It is good that techies aren't the only ones involved, but the lack of strong technical backgrounds may show in the limited attention to how some of the major recommendations could actually be implemented.
The report itself is 128 pages long, but the twelve main recommendations (divided into four categories) are listed on pages 14 and 15. They are:
PUBLIC REGULATION IS NEEDED TO IMPOSE TRANSPARENCY REQUIREMENTS ON ONLINE SERVICE PROVIDERS.
1. Transparency requirements should relate to all platforms' core functions in the public information ecosystem: content moderation, content ranking, content targeting, and social influence building.
2. Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.
3. Sanctions for non-compliance could include large fines, mandatory publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country's market.
A NEW MODEL OF META-REGULATION WITH REGARDS TO CONTENT MODERATION IS REQUIRED.
4. Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law: legality, necessity and proportionality, legitimacy, equality and non-discrimination.
5. Platforms should assume the same kinds of pluralism obligations that broadcasters have in the different jurisdictions where they operate; an example would be a voluntary fairness doctrine.
6. Platforms should expand the number of moderators and spend a minimal percentage of their income on improving the quality of content review, particularly in at-risk countries.
NEW APPROACHES TO THE DESIGN OF PLATFORMS HAVE TO BE INITIATED.
7. Safety and quality standards of digital architecture and software engineering should be enforced by a Digital Standards Enforcement Agency. The Forum on Information and Democracy could launch a feasibility study on how such an agency would operate.
8. Conflicts of interest of platforms should be prohibited, in order to avoid the information and communication space being governed or influenced by commercial, political or any other interests.
9. A co-regulatory framework for the promotion of public-interest journalistic content should be defined, based on self-regulatory standards such as the Journalism Trust Initiative; friction to slow down the spread of potentially harmful viral content should be added.
SAFEGUARDS SHOULD BE ESTABLISHED IN CLOSED MESSAGING SERVICES WHEN THEY ENTER INTO A PUBLIC SPACE LOGIC.
10. Measures that limit the virality of misleading content should be implemented through limitations on some functionalities, opt-in features to receive group messages, and measures to combat bulk messaging and automated behavior.
11. Online service providers should be required to better inform users regarding the origin of the messages they receive, especially by labeling those which have been forwarded.
12. Notification mechanisms for users to report illegal content, and appeal mechanisms for users who have been banned from services, should be reinforced.