A long, but quite thoughtful, piece from the MIT Technology Review points out that any and all of Facebook's attempts to deal with misinformation and extremism are being derailed by the focus on Facebook's eternal growth. (As they say, growth for the sake of growth is the ideology of the cancer cell.)
Some interesting bits:
"The models that maximize engagement also favor controversy, misinformation, and extremism: put simply, people just like outrageous stuff."
"In 2017, Chris Cox, Facebook’s longtime chief product officer, formed a new task force to understand whether maximizing user engagement on Facebook was contributing to political polarization. It found that there was indeed a correlation, and that reducing polarization would mean taking a hit on engagement. In a mid-2018 document reviewed by the Journal, the task force proposed several potential fixes, such as tweaking the recommendation algorithms to suggest a more diverse range of groups for people to join. But it acknowledged that some of the ideas were “antigrowth.” Most of the proposals didn’t move forward, and the task force disbanded."
"But anything that reduced engagement, even for reasons such as not exacerbating someone’s depression, led to a lot of hemming and hawing among leadership. With their performance reviews and salaries tied to the successful completion of projects, employees quickly learned to drop those that received pushback and continue working on those dictated from the top down."
@rslade wrote:...
"But anything that reduced engagement, even for reasons such as not exacerbating someone’s depression, led to a lot of hemming and hawing among leadership. With their performance reviews and salaries tied to the successful completion of projects, employees quickly learned to drop those that received pushback and continue working on those dictated from the top down."
My Momma always said, "Evil is as evil does."
Craig
Rant
Yeah, I'm not sure Facebook really gets it. For example, when you report content that clearly goes against their Community Standards, the first thing they want you to do is stop following that specific person or stop seeing their posts. I can still be friends with people I disagree with, but I will hold them accountable for their published content and for being good people. How does my not following that person prevent the content from getting spread around the platform? Sure, I don't see it on my feed, but that doesn't mean it's not being circulated. Operators (see what I did there?) have to take an additional step to report it. Maybe they think we lack object permanence?