For all the platforms’ talk of advanced algorithms and instant removal of rule-violating content, these events seem to show them failing when they count the most: in extremity.

The video of Ronnie McNutt’s suicide originated on August 31, and took nearly three hours to take down in the first place, by which time it had been seen and downloaded by innumerable people.

How could something so graphic, so plainly in violation of the platform’s standards, and actively flagged by users be allowed to stay up for so long?

In a “community standards enforcement report” issued Friday, Facebook admitted that its army of (contractor) human reviewers, whose thankless job it is to review violent and sexual content all day, had been partly disabled due to the pandemic:

“With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram. The number of appeals is also much lower in this report because we couldn’t always offer them. We let people know about this and if they felt we made a mistake, we still gave people the option to tell us they disagreed with our decision.”

McNutt’s friend and podcast co-host Josh Steen told TechCrunch that the stream had been flagged long before he killed himself. “I firmly believe, because I knew him and how these interactions worked, had the stream ended it would’ve diverted his attention enough for SOME kind of intervention,” Steen wrote in an email. “It’s pure speculation, but I think if they’d have cut his stream off he wouldn’t have ended his life.”

When I asked Facebook about this, I received the same statement others have: “We are reviewing how we could have taken down the livestream faster.” One certainly hopes so.

But Facebook cannot contain the spread of videos like this - and the various shootings and suicides that have occurred on its Live platform in the past - once they’re out there.

At the same time, it’s difficult to imagine how other platforms are caught flat-footed: TikTok had the video queued up in users’ “For You” pages, exposing countless people by an act of algorithmic irresponsibility. Surely even if it’s not possible to keep the content off the service entirely, there ought to be something preventing it from being actively recommended to people.

YouTube is another, later offender: Steen and others have captured many cases of McNutt’s video or image being used, sometimes being monetized. He sent screenshots and video showing ads from Squarespace and the Motley Fool running ahead of the video of McNutt.

It’s disappointing that the largest video platforms on the planet, which seem to never cease crowing about their prowess in shutting down this kind of content, don’t seem to have any serious response. TikTok, for instance, bans any account that makes multiple attempts to upload the clip. What’s the point of giving people a second or third chance here?

Facebook couldn’t seem to decide whether the content was in violation or not, as evidenced by several re-uploads of the content in various forms that were not taken down when flagged. Perhaps these are just the ones slipping through the cracks, while thousands more are nipped in the bud, but why should we give a company like Facebook, which commands billions of dollars and tens of thousands of employees, the benefit of the doubt when they fail for the nth time on something so important?

“Facebook went on record in early August saying they were returning back to normal moderation rates, but that their AI tech actually had been improved during the COVID slowdowns,” Steen said. “So why’d they totally blow their response to the livestream and the response time after?”