Cyberbullying


Twitter’s Transparency Center Reveals Where Enforcement Stands

Many parents are pressing social media companies to clamp down on cyberbullying and misinformation, and most social media sites pay plenty of lip service to doing more about it or stepping up enforcement. But what does that actually look like? Take a look at Twitter’s new Transparency Report, with sections dedicated to various elements of enforcement. Over the most recent six months, Twitter has been working to enforce its rules around several areas of concern, including discussion around the #BlackLivesMatter movement, misinformation about COVID-19, and controversial, divisive political content, including commentary from the US President.

Almost Half of Teens’ Romantic Relationships End in Online Harassment

Sadly, 48% of US teens who have been in a romantic relationship say they have been stalked or experienced harassment after the relationship ended. The rise of social media seems to play a huge role in this alarming stat because it presents so many more communication channels for harassment and/or virtual stalking. Lessons for parents? Pay attention to what’s going on, and be there to guide tweens and teens if there are signs that a relationship is becoming unhealthy.

Ready for a Pandemic Gaming Party?

Party Place is a new feature being beta tested on Roblox that allows kids to create private mini social networks exclusively with friends to chat, hang out, and plan which games to play. The venue itself doesn’t offer any activities or games; rather, it serves as a private place for Roblox users to gather, whether for a virtual birthday party, a remote learning activity with classmates, a virtual playdate, or anything else, while the group decides which Roblox game to play next.

For today’s younger players, platforms like Fortnite and Roblox are becoming their own version of a social network. The kids don’t just go online to play. They socialize, chat and hang out with a mix of real-life friends and virtual ones, blurring the lines between online and offline in ways that traditional social networks, like Facebook, do not. Of course, this also opens up another avenue for cyberbullying, so, as with all forms of social media, be sure to watch for signs that your child may be a target.

Oh No, Omegle Again

As quarantine boredom sinks to new lows, kids are turning to random-video-chat platforms like Omegle to see what other bored kids are doing. We don’t recommend it. Omegle is a website that pairs random visitors through video and text chat, and it has spiked in popularity over the last four months. The site is similar to the once wildly popular Chatroulette, which is also experiencing a renaissance of sorts: it is free, requires no registration, and promises a surprising social experience. Visitors can submit keywords to filter for people with shared interests. Those in college can enter a .edu email address, which the site uses for verification to find other students. There is also, predictably, an “adult” section. As the site itself warns, predators and bullies have been known to use the platform, and it recommends that no one under 13 use the service. Keep this in mind if you hear your kids mention the site, so you can monitor their use and remind them not to give away any personal information, especially during a video chat. They could unintentionally reveal details about their whereabouts, their family, or their school simply by being in a room with pictures or other identifying clues in the background.

eSports: The Things You Don’t Think About

eSports played at the high school level has received lots of positive attention, but as Elliot Levine points out in this article, “Addressing the 800-pound gorillas in scholastic esports,” challenges still exist. While fun, spirited “trash talk” has become part of online gaming culture, harassing and bullying other players crosses the line. In fact, a 2019 survey from the Anti-Defamation League found that 65 percent of players reported experiencing “severe harassment,” including physical threats, sustained harassment and stalking. Prolonged play can also lead to physical injuries, including eye fatigue and neck, back, wrist and hand pain. These are all things to be aware of and keep in mind if your children decide to take up eSports.

Groups Urge Facebook Advertisers to Boycott Platform Over Hate Speech

Civil rights groups, including the Anti-Defamation League, Color of Change, Common Sense Media, Free Press, the NAACP and Sleeping Giants, are launching a social media campaign, #StopHateForProfit, urging large Facebook advertisers to boycott the platform unless it makes formal moves to curtail the proliferation of hate speech on its platform. The groups are also asking Facebook to take steps such as removing ads labeled as misinformation or hateful, informing advertisers when their media buys appear near harmful content, and granting refunds in those cases. The list of companies taking part is growing by the day, although critics have questioned the boycott’s effectiveness, pointing out that these companies are not taking down their pages and will most likely resume buying ads on Facebook after July.

These actions are one example of recent backlash against Facebook, which seemed to intensify when a flurry of misinformation appeared on the social platform amid worldwide protests against racism and police brutality. The company declined to take action against posts from President Trump — despite Twitter flagging that same content as misleading or glorifying violence. Facebook did remove ads from Mr. Trump’s re-election campaign that featured a symbol used by the Nazis during World War II. The company also announced that it would gradually allow users to opt out of seeing political ads, and has acknowledged in a blog post that its enforcement of content rules “isn’t perfect.”

Addressing Privacy in Video Conferencing and Online Classes

A reminder for parents and kids: participating in a video meeting for school work, extracurriculars, or just socializing provides a window into your home. Parents should help kids think about their surroundings and what may be visible during an online class meeting. Both Zoom and Meet allow users to change the background image, a feature that addresses privacy and helps students who might feel insecure about their homes. Cyberbullies love to feed on any kind of personal information that might be revealed in what is hanging on your walls, interactions with family members while online, and other clues to your family’s life, so it is worth taking the time to create the right background for an online class and reminding family members to give the online participant space.

Understanding Section 230 of the Communications Decency Act

It will be very interesting to see what effect the executive order President Trump recently signed targeting Section 230 of the Communications Decency Act has on cyberbullying and misinformation online. While you may have never heard of this section of the law, it was created in 1996 to protect Internet platforms from liability for many of the things third parties say or do on them. But now it’s under threat from President Trump, who hopes to use the order to fight back against the social media platforms he believes are unfairly censoring him and other conservative voices. Some critics say he is trying to bully these platforms into letting him post anything he wants without correction or reprimand, even when he has broken a site’s rules about posting bullying comments or questionable information.

In a nutshell, Section 230 says that Internet platforms that host third-party content (for example, tweets on Twitter, posts on Facebook, photos on Instagram, reviews on Yelp, or a news outlet’s reader comments) are not liable for what those third parties post (with a few exceptions). For instance, if a Yelp reviewer were to post something defamatory about a business, the business could sue the reviewer for libel, but it couldn’t sue Yelp. Without Section 230’s protections, the Internet as we know it today would not exist. If the law were taken away, many websites driven by user-generated content would likely be shut down. As Senator Ron Wyden, one of the authors of Section 230, has said, the law is both a sword and a shield for platforms: they’re shielded from liability for user content, and they have a sword to moderate it as they see fit.

That doesn’t mean Section 230 is perfect. Some argue that it gives platforms too little accountability, allowing some of the worst parts of the internet (think cyberbullying that parents or schools struggle to have removed, or misinformation that stays online for all to see with little recourse) to flourish along with the best. Simply put, Internet platforms have been happy to use the shield to protect themselves from lawsuits, but they’ve largely ignored the sword to moderate the bad stuff their users upload. It is also worth remembering that cyberbullying accounts for less than one tenth of one percent of all online traffic, but it is still important for these sites to acknowledge their role and do more about it.

All that said, this protection has allowed the Internet to thrive. Think about it: Websites like Facebook, Instagram, and YouTube have millions and even billions of users. If these platforms had to monitor and approve every single thing every user posted, they simply wouldn’t be able to exist. No website or platform can moderate at such an incredible scale, and no one wants to open themselves up to the legal liability of doing so. But if that free flow of information and creativity goes away, our online world will be very different.

So where do we stand? While the executive order sounds strict (and a little frightening, with the government making “watch lists” of people who post or support “certain kinds” of content), legal experts don’t seem to think much, if any, of it can be backed up, citing First Amendment concerns. It’s also unclear whether the Federal Communications Commission has the authority to regulate Section 230 in this way, or whether the president can change the scope of a law without any congressional approval.

Facebook Planning to Use Artificial Intelligence to Combat Hateful Memes

Facebook is combating hate speech and misinformation by developing natural language processing models and a database of meme examples for training artificial intelligence moderators. The company, together with DrivenData, will also launch the Hateful Memes Challenge, which will award $100,000 to researchers who develop AI models that can detect hate speech in memes.

Where Did That Facebook Post Come From?

In its latest effort to improve content transparency leading up to the US election, Facebook is adding new location markers on individual business profile posts, on both Facebook and Instagram, which will highlight where the managers of that page or account are primarily located, helping to provide additional context. That’s another tool for you and your kids to use in combatting cyberbullying and misinformation. This new transparency feature will be particularly important with the election in the fall of 2020.
