Graphic images and video of the aftermath of two bloody events in Texas this weekend — a shooting at a shopping center and a car crashing into a group of migrants — have circulated widely on Twitter over the past few days, sparking widespread outrage and renewed concern about the platform's moderation capabilities under CEO Elon Musk.
Some users said the images were pushed into their "For You" feed, which was introduced earlier this year and surfaces content based on Twitter's recommendation system and a user's preferences based on who they follow. Users can choose between the "For You" feed and the "Following" feed, which only shows tweets from accounts a user follows.
David Hogg, a gun control advocate and survivor of the Parkland shooting, tweeted on Sunday — in response to a poll by Musk asking whether he had improved the platform in the last six months — that graphic images had appeared in his "For You" feed.
"Well, yesterday I saw significantly more photos and videos of people dead from the most recent mass shooting in Texas on my For You page and on my timeline than ever before," Hogg tweeted. "So it's not great in that sense."
That concern was countered by some calls for graphic images to be distributed more widely, a decades-old argument that has gained renewed momentum in recent years around the gun control debate.
"The whole genre of violent attack photos and videos is fraught with danger," said Eric Goldman, a professor at Santa Clara University School of Law and co-director of the High Tech Law Institute. "On the one hand, many or most people don't want to see it. On the other hand, it can be critical evidence of what happened, and in some cases, it can shock people into changing their minds. We both want and don't want that content to be widely available."
"That puts internet services in a no-win situation," he added.
The Allen Mall videos, which NBC News has seen and is choosing not to link to or embed, show mutilated bodies piled on top of each other outside the mall, many covered in blood.
Twitter's "sensitive media policy" states that users "may not post media that is excessively gory or share violent or adult content within live video or in profile header or listed banner images." It also says media depicting "excessively gory content," violence and/or sexual assault, bestiality or necrophilia are not allowed.
Twitter allows users to post "sensitive" media if they mark their account as such, which places images and videos behind a warning that must be clicked before the media is displayed.
It's unclear how many employees Twitter has to enforce those rules. Musk has said he laid off about 80% of the company's staff, with NBC News reporting that the company's moderation teams were downsized.
Searches on Twitter Monday morning turned up a wide variety of unmoderated videos with no content warnings that appear to violate those rules, including pornographic videos.
Goldman said he doesn't know what Musk's motivation is, but noted the company's pullback on moderation.
"Content moderation is quite difficult to do well," he said, adding that the company has ways to host such videos while protecting users. "The question is, why didn't Twitter put this behind a warning screen, and was that the wrong choice?"
Ella Irwin, Twitter's vice president of product who oversees trust and safety, did not immediately respond to a request for comment on the videos of the mall shooting. Musk tweeted throughout the weekend, at times mentioning the shooting in support of mental health efforts.
YouTube spokesman Jack Malon said in an email that the company's trust and safety teams removed videos of the mall shooting over the weekend that violated its Community Guidelines.
Another, less graphic video showed white sheets covering corpses surrounded by blood. Another, from a car dash cam, showed the moment a man opened fire at the mall. Still other videos came from people inside the mall documenting their efforts to hide until help arrived.
Videos from Brownsville, Texas, where eight people were killed and 10 others injured by an SUV, also showed a brutal and bloody scene. Some of those videos remain live on Twitter and have been seen by NBC News.
In the case of the graphic mall shooting video, many accounts posted it to Twitter in the hours after the incident. As of Monday morning, a Twitter search showed that most of those tweets had been deleted, though it was not entirely clear whether they had been deleted by those users or by Twitter.
That sentiment was not unanimous. Other users pushed for the videos to be seen by more people, arguing that the disturbing images need to be seen for people to fully understand the magnitude of these violent acts and the widespread availability of guns in the U.S.
It’s a point that has found some increasing traction as the broader issue of gun regulation remains frozen.
"I thought long and hard about whether to share the horrifying video showing the pile of bodies from the mass shooting at the outlet mall in Allen, Texas," tweeted Jon Cooper, a Democratic fundraiser who has worked for President Joe Biden and former President Barack Obama.
“But maybe, just maybe, people NEED to see this video, so they will pressure their elected officials until they TAKE ACTION,” he added.
Jamelle Bouie, an opinion columnist for The New York Times, also said he thinks people need to see the aftermath of these shootings, but that who sees them matters, too.
“Years ago I wrote that the public needs to see the results of our experiment in unlimited gun ownership, and I still believe that is true. That includes the Supreme Court,” Bouie tweeted.
Emily Bell, founding director of the Columbia Journalism School’s Tow Center for Digital Journalism, offered a counterpoint.
"There is nothing virtuous or ethical about showing easily identifiable dead children and adults whose families may not yet know they are dead," Bell tweeted. "It is profoundly unethical: it strips victims and their families of privacy and dignity in death. It only serves Musk's click farm."
The open circulation of graphic videos after violent events is a common occurrence on social media, and many consumer technology platforms have worked to limit it through a combination of human intervention and automated systems that can identify when certain pieces of media are being repeatedly uploaded.
Many consumer technology companies have invested in efforts to limit the spread of graphic images, especially after violent events. But most also have policies that allow them to weigh whether a particular video is important for the public to view.
One of the most notable examples occurred around the video of the shooter who killed 49 people in Christchurch, New Zealand, with Twitter and YouTube fighting to remove copies that were repeatedly uploaded. Such videos are generally considered violent propaganda.
Most recently, YouTube allowed police body camera video of a school shooting to remain online, with the company claiming the video was in the public interest.
Goldman said thinking on moderation has changed since the Christchurch shooting, pointing to examples of graphic photos now considered historically significant, such as the "Napalm Girl" photo.
"Seeing that photo helped change the American view of the Vietnam War," he said.
Still, Goldman said there’s no clear consensus on what platforms should do with videos like the ones that surfaced last weekend.
"The content moderation dilemmas that internet companies face in these circumstances cannot please everyone," he said. "I feel like this is another time like that."