Misinformation

Curious About How Conspiracy Theories Get Spread Online?

The latest online attacks against the teen survivors of the Parkland shooting are a good case study in how this happens and how quickly it occurs. An article in The Washington Post entitled "We studied thousands of anonymous posts about the Parkland attack – and found a conspiracy in the making" outlines the part that anonymous social media forums play in the process. It’s a primer on how misinformation is created on purpose, how it endures, and the havoc it wreaks in the lives of those who are targeted.

Who Sponsored That Ad?

The Federal Election Commission has drawn up a proposed framework that would require political digital and social media ads to adopt the same sponsorship disclaimer rules as those appearing on TV, on radio, and in print. Political audio and video ads on both social and digital platforms would require candidates paying for the ads to say their names and include the statement, "And I approve of this message," and graphic and text ads would have to display the sponsor's name "in letters of sufficient size to be clearly readable," the proposal says. In addition, Facebook has announced that it will mail postcards to political ad buyers to verify that they live in the US. A code from the postcard will be needed to buy a political ad on the platform, and November's midterm elections will be the first time the process is used.

Facebook Users Vet News Sources

Facebook's latest News Feed update will prioritize news sources rated as trustworthy by "a diverse and representative sample" of its users, the company's News Feed chief Adam Mosseri wrote in a recent blog post. Publications with lower scores could see a decrease in distribution, and there will also be an emphasis on promoting local news. Facebook CEO Mark Zuckerberg, writing recently on the same subject, said that prioritizing news from trusted publishers is part of Facebook’s broader effort to revamp the News Feed and “encourage meaningful social interactions with family and friends over passive consumption.”

Facebook May Be Losing the War on Hate Speech

Can Facebook actually keep up with the hate speech and misinformation that pours through the platform? Facebook seems to be working on the misinformation side, though even that is coming in fits and starts. Facebook had to retreat from using red flags to signal that articles are fake news after discovering that the flags instead spurred people to click on them or share them; it has gone instead to listing links below the article to related articles with a more balanced view.

Now a new investigation from ProPublica shows that Facebook’s policy regarding hate speech is also having issues. In an analysis of almost 1,000 crowd-sourced posts, reporters at ProPublica found that Facebook fails to evenly enforce its own community standards. A sample of 49 of the 900 posts was sent directly to Facebook, which admitted that in 22 cases its human reviewers had erred, mistakenly flagging frank conversations about sexism or racism as hate speech. The problem is that the company also does not offer a formal appeal process for decisions its users disagree with, so seemingly innocent outbursts may also get caught up in the reviewers’ net.

 

It is definitely a tough issue, and this year Germany will enforce a law requiring social media sites—including Facebook, YouTube, and Twitter, but also more broadly applying to websites like Tumblr, Reddit, and Vimeo—to act within 24 hours to remove illegal material, hate speech, and misinformation. Failure to do so could lead to fines of up to 50 million euros, or about $60 million. Is this what should be done here in the US, or is that too strict? Perhaps the topic of policing content is a good dinner table discussion to have with your teens?

No More Red Flags on Facebook

Facebook is getting rid of the red flags that signal articles are fake news after discovering the flags instead spurred people to click on them or share them. The company is instead including related links under such articles that will provide more trustworthy sources reporting on the topic. The “related articles” effort is something Facebook started testing earlier this year. By the way, if you do try to share posts with contentious content, a message will pop up telling you that you may want to check out other sources before you do so. In other words, you won’t be able to use the excuse that you had “no idea” that the article you passed on might have false or unproven content.

Former President Obama Talks to Prince Harry About Social Media

Former President Barack Obama and the United Kingdom's Prince Harry took to the airwaves for a recent BBC interview where they discussed the potential dangers of social media and how it should be used to promote diversity and find common ground. "One of the dangers of the internet is that people can have entirely different realities. They can be cocooned in information that reinforces their current biases," Obama stated. The former president also echoed something that parents concerned about their kids growing up in a digital age try to communicate to their children, reiterating that "the truth is that on the internet everything is simplified and when you meet people face to face it turns out they are complicated." Perhaps something every cyberbully should remember?

YouTube Kids To Get More Human Moderators

Snapchat Takes Aim at Misinformation

Snapchat is taking aim at misinformation with some unconventional changes to the design of the app (which for many parents has been associated with cyberbullying and sexting in the past). While the app will still initially open to the phone camera, allowing users to make and share photos that disappear with friends, the new design will try to separate the personal (social) side of the app from what is produced by outside media sources. The media side will also be vetted and approved by humans at Snap, the parent company, not by algorithms. The use of human curators will allow Snapchat to program content to make sure that users’ preferences are not keeping them from seeing a wide array of opinions and ideas. In addition to winnowing out fake news, this may keep Snapchat from becoming a place that reinforces narrow sets of thinking. The approach is in contrast to that of Facebook and Google, which have not vetted much of the hate speech, fake news, and even disturbing videos aimed at children that have proliferated on those platforms over time.

The Trust Project and Fake News

Still worried about falling into a “fake news” trap by reading or passing along something that isn’t factual? A nonpartisan effort called The Trust Project, run by a group hosted at Santa Clara University, is working to address this situation by helping online users distinguish between reliable journalism and promotional content or misinformation. Recently, Facebook started offering “Trust Indicators,” a new icon that will appear next to articles in your News Feed. When you click on the icon, you can read information the publisher has shared about its organization’s “ethics and other standards, the journalists’ backgrounds, and how they do their work,” according to an announcement from The Trust Project.

It is a work in progress, with Facebook, Google, Bing, Twitter, and other international news organizations committing to displaying these indicators, although not all implementations are in place.

The onus to figure out whether something is fake, though, is still on the user. Instead of labeling content as disputed, Trust Indicators allow users to learn more about the organization behind the news and come to their own conclusions about the content. Whether it will actually help in the long run, of course, remains to be seen.

Bunk – The History of Plagiarism, Hoaxes and Fake News

We continue to need to talk to kids about how to evaluate sources online and off, but we all should probably know more about the history of hoaxes, plagiarism, and fake news. A new book entitled Bunk: The Rise of Hoaxes, Humbug, Plagiarists, Phonies, Post-Facts, and Fake News by Kevin Young draws connections between the days of P.T. Barnum and the 21st century and compares terms like “swindler” and “confidence man” to contemporary buzzwords like “plagiarism,” “truthiness,” and “fake news.” More than just telling tales of hoaxes revealed, Young discusses the theory of the hoax and the effects of deception on politics, online news, and everyday life, then and now.
