Mark Zuckerberg used his biggest press event of the year to briefly address an uproar over a murder video posted to Facebook.
"We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening," Zuckerberg said on stage at F8, Facebook's annual developers conference.
"Our hearts go out to the family and friends of Robert Godwin Sr.," Zuckerberg added. He opened his keynote with jokes about the Fast and the Furious. He spoke briefly about building community before addressing the Cleveland murder.
On Sunday, a Cleveland man posted a video to Facebook (FB) of himself shooting and killing Godwin, a 74-year-old grandfather who was on his way home from an Easter meal with family.
Just an hour before the conference kicked off, news broke that the murder suspect had been found dead of a self-inflicted gunshot wound.
The murder video stayed up for more than two hours on Sunday before it was removed by Facebook, according to a timeline later shared by the company. The delay reignited criticism over Facebook's handling of offensive content.
"We know we need to do better," Justin Osofsky, VP of global operations at Facebook, wrote in a post Monday.
The video is just the latest in a growing list of disturbing videos of murder, suicide, torture and beheadings published to Facebook, however briefly, through live broadcasts or video uploads.
A source close to Facebook says it has "thousands" of people reviewing content around the world. Once a piece of content is reported by users as inappropriate, it is typically reviewed "within 24 hours."
But some have criticized Facebook for making users its first line of defense.
"It's actually the users who are exposed to something that they find disturbing, and then they start that process of review," says Sarah T. Roberts, an assistant professor at UCLA who studies online content moderation.
In the case of the most recent murder video, nearly two hours passed before users reported it on Facebook, according to the company. Facebook disabled the account behind the video 23 minutes after that.
In a lengthy manifesto about the future of Facebook published in February, Zuckerberg acknowledged "terribly tragic events -- like suicides, some live streamed -- that perhaps could have been prevented if someone had realized what was happening and reported them sooner."
Zuckerberg said Facebook is developing artificial intelligence to better flag content on the site. This system "already generates about one-third of all reports to the team that reviews content," according to Zuckerberg's post.