Facebook has long struggled with violence going viral through its live-streaming service. To help curb the spread of terrorist content, Facebook is now working with police, using footage of shootings that officers provide.
The move is intended to improve the social network's ability to detect live streams of terrorism and potentially alert officers to an attack sooner.
Facebook will provide officers at the Met's firearms training centers with body cameras, in a bid to help its artificial intelligence more accurately and rapidly identify videos of real-life first-person shooter incidents.
These reforms came after Facebook drew heavy criticism over the live stream of the New Zealand mosque shootings in March, which left 51 people dead. The video was viewed fewer than 200 times during its live broadcast and about 4,000 times in total before it was removed.
Facebook primarily relies on AI to spot violating or inappropriate content and remove it as quickly as possible. In the case of the Christchurch terrorist attack, however, the system simply did not have enough first-person footage of violent events to match the video against.
The tech giant has partnered with police in the UK and US, who are supplying first-person perspective videos of firearms training sessions. According to the UK's Metropolitan Police, Facebook will use the footage to develop an algorithm that can detect live streams posted by shooters.
In a statement, Facebook said: “With this initiative, we aim to improve our detection of real-world, first-person footage of violent events and avoid incorrectly detecting other types of footage such as fictional content from movies or video games.”
Facebook approached the Met with the idea, and officers will begin providing footage in October. The Met said US law enforcement would also supply shooting footage to Facebook, without specifying which agencies.