Facebook blames AI for failing to catch New Zealand attack video


"We are re-examining our reporting logic and experiences for both live and recently live videos in order to expand the categories that would get to accelerated review", Rosen explained. This was 12 minutes after the end of the livestream.

"Our responsible investment decisions are guided by New Zealand law and major policy positions of the New Zealand Government".

"This particular video did not trigger our automatic detection systems", Rosen wrote. As part of its probe into the incident, Facebook reviewed 1.5 million reuploads of the attack.

The New Zealand broadcast, Facebook said, was reported for reasons "other than suicide and as such it was handled according to different procedures".
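That routing detail implies a triage rule in which only certain report categories reach accelerated human review. Here is a hedged sketch of what such a rule could look like; the category names and queueing logic are assumed purely for illustration, since Facebook has not published its actual list.

```python
from dataclasses import dataclass

# Report categories assumed to reach accelerated review before the change
# Rosen describes; the real category list is not public.
ACCELERATED_CATEGORIES = {"suicide", "self_harm"}

@dataclass
class Report:
    video_id: str
    category: str
    is_live: bool

def route_report(report: Report) -> str:
    """Send a user report to the accelerated or the standard review queue."""
    if report.category in ACCELERATED_CATEGORIES:
        return "accelerated"
    return "standard"

# The Christchurch stream was reported for reasons "other than suicide",
# so under a rule like this it would sit in the slower standard queue:
print(route_report(Report("v1", "graphic_violence", is_live=False)))  # standard
```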

Under fire for taking so long to remove the Christchurch terrorist's video of his mosque attacks, Facebook has come up with an utterly lame excuse: namely, that no one reported it as a suicide video or triggered other "protocols" that would've gotten fast action. Not only was the Christchurch killer able to livestream the massacre on Facebook, but the video was still widely available on other platforms hours after the attack.


Facebook's AI failed to detect live video of last Friday's New Zealand mosque attack because the system simply isn't smart enough to recognize a mass shooting filmed by the attacker, the social network says.

In a Wednesday statement issued by Guy Rosen, Facebook's VP of integrity, the company outlined numerous actions it is undertaking after video of the shootings that killed 50 people showed up on the platform.

The social network has been criticized for the failure of its AI technology to detect the video automatically. YouTube, which also scrambled to take down reuploads, did not disclose the exact number of videos it removed, but said it was in the tens of thousands.

Facebook said the video was viewed about 4,000 times in total, with the first report coming in 29 minutes after the broadcast began. "This is different from official terrorist propaganda from organizations such as ISIS [Daesh] - which while distributed to a hard core set of followers, is not rebroadcast by mainstream media organizations and is not re-shared widely by individuals", Rosen said.

Rosen also rejected suggestions that Facebook impose a broadcast delay on live video. "A delay would not help address the problem due to the sheer number of videos", he said. "More importantly, given the importance of user reports, adding a delay would only further slow down videos getting reported, reviewed and first responders being alerted to provide help on the ground", he wrote.
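Rosen's "sheer number of videos" point is easy to check with rough arithmetic. The figures below are assumptions for illustration (Facebook says only that there are millions of Live broadcasts daily), but they show how quickly the review load grows if every delayed stream had to be screened by a person.

```python
# Rough arithmetic behind the "sheer number of videos" point. All figures
# are assumptions for illustration; Facebook says only that there are
# millions of Live broadcasts per day.
daily_broadcasts = 3_000_000       # assumed volume of Facebook Live streams
review_minutes_each = 2            # assumed time to screen one delayed stream
shift_minutes = 8 * 60             # one reviewer working an 8-hour shift

reviews_per_shift = shift_minutes // review_minutes_each    # 240 streams
reviewer_shifts_needed = daily_broadcasts / reviews_per_shift

print(f"{reviewer_shifts_needed:,.0f} reviewer-shifts per day")  # 12,500
```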
