Facebook goes public with rules
Last week, for the first time, Facebook published its Community Standards, the guidelines that its 7,500 moderators use to decide which content to remove and which people or pages to ban. Also for the first time, users can now appeal when they believe their posts have been removed unfairly.
The move toward transparency comes on the heels of the Cambridge Analytica data scandal and almost two years after allegations were published that the curation team for Facebook's "trending" section had suppressed certain stories. (Months later, the company would turn its trending story curation over to an algorithm, a move that generated new problems and criticism.)
The world's largest social media platform has faced accusations of inconsistent rule enforcement and political bias ever since. Recent examples include its deeming posts from pro-Trump activists Diamond and Silk "unsafe to the community" (in error, it later said) and its automated spam filter temporarily blocking users from sharing a Daily Caller article containing the just-released full transcript of text messages between two former FBI employees who were critical of President Trump.

- Note: Facebook’s Community Standards are not new (its content-moderation experts update them regularly), but the challenges inherent in enforcing them are clear from the introduction: They apply to 2 billion people in 40 countries. Hate speech laws vary, as do local definitions of what is considered “indecent” or “vulgar.” And even with guidelines, it takes editorial judgment, which is necessarily subjective, and an understanding of context to determine whether something is “offensive” or “dangerous” (in Myanmar, human rights activists say their complaints about dangerous posts have long gone unheeded), or whether a Pulitzer Prize-winning photo of a naked girl fleeing a napalm attack in Vietnam is “inappropriate” child nudity.
- Also note: Facebook expanded its staff of moderators by 3,000 as recently as 2017, after a rash of live videos showing rapes, killings and shootings was posted. While Facebook has long argued that it is a platform, not a publisher, the recently released guidelines resemble the ethics guidelines found in many newsrooms.
- Discuss: Should Facebook moderate the content shared on its platform, or should it allow users to share anything they want? Why? What other social media platforms moderate content? How do Facebook’s moderation guidelines compare with those of other platforms, such as YouTube? Should social media platforms moderate and curate the topics that trend, or should trending topics be driven entirely by raw usage metrics, even if the topics are potentially offensive or dangerous?
Quick hits
- Why do local news organizations share news that isn't local?
Shan Wang, a staff writer at NiemanLab, analyzed Facebook posts by the news divisions at 28 local TV stations and discovered that just over half of those posts could be considered legitimately “local.” She also found evidence of sharing patterns among stations owned by the same corporate parents.
- Discuss: Why might local news outlets share nonlocal stories on Facebook? Does this count as “engagement baiting”? Why or why not?
- Idea: Continue and extend Wang’s work by surveying the Facebook posts of local news outlets in your community. What percentage are truly local? Are nonlocal stories always labeled as such to avoid confusion? Are there differences in the posting trends of local TV, radio and print news organizations? Have students email or tweet at Wang to learn more about her methodology or to share their results. (A minimal sketch for tallying results follows this item.)
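For classes that want to tally their survey the way Wang did, here is a minimal sketch of the arithmetic in Python. It assumes students have hand-labeled each post in a spreadsheet saved as posts.csv with outlet and is_local columns; the file name and column names are illustrative assumptions, not part of Wang’s published methodology.

```python
# Minimal sketch: compute the share of "local" Facebook posts per outlet.
# Assumes a hand-labeled file posts.csv with two columns (hypothetical names):
#   outlet   - name of the news organization
#   is_local - "yes" or "no", assigned by a student reviewer
import csv
from collections import defaultdict

local_counts = defaultdict(int)   # posts labeled local, per outlet
total_counts = defaultdict(int)   # all labeled posts, per outlet

with open("posts.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        outlet = row["outlet"].strip()
        total_counts[outlet] += 1
        if row["is_local"].strip().lower() == "yes":
            local_counts[outlet] += 1

for outlet in sorted(total_counts):
    share = 100 * local_counts[outlet] / total_counts[outlet]
    print(f"{outlet}: {local_counts[outlet]}/{total_counts[outlet]} local ({share:.0f}%)")
```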
- Road map to requesting your data profile
An app developer at ProPublica has published a piece with step-by-step instructions for requesting the data that several major platforms and data brokers have collected about you. The list includes Cambridge Analytica (which is based in England and so must comply with strict U.K. data privacy regulations, yet still makes it extremely difficult to get a copy of the data it holds about you), along with Oracle, Facebook and Google and the data brokers ALC Digital, Experian and Epsilon.
- Idea: Divide the list of companies in the article among groups of students and have each request one person’s data profile (either a student, a friend or a family member who agrees). Have each group log the steps and track the time it takes to retrieve the requested data, then share highlights and insights (with the permission of the subject, of course) with the class.
- Malaysia hands out first punishment for 'fake news'
A Malaysian judge sentenced a Danish citizen visiting Malaysia to a week in jail and a fine of 10,000 ringgit (about $2,500) for posting a video on YouTube that accused the government of an intentionally slow response to the shooting of a Palestinian man in Kuala Lumpur in April. The sentence was the first punishment under a new “fake news” law that prohibits the deliberate creation and sharing of false information in Malaysia. (h/t Mike Caulfield for alerting us to this.)
- Discuss: Should all countries enact laws punishing the deliberate creation and sharing of false information? Why or why not?
Please share this newsletter with others who may find this information useful (archives and subscribe form here). For more examples and ideas like these, you can follow me on Twitter (@PeterD_Adams). Also follow @NewsLitProject and @MrSilva.
If you have suggestions for future issues of The Sift, please share them here.
If you're looking for engaging and effective news literacy resources, check out NLP's checkology® virtual classroom. We’re giving away student licenses for 1:1 functionality for the rest of the 2017-18 school year. Yes, it’s free.