The tech giant came under criticism after video of the attack, which killed 50 people, was livestreamed and widely circulated on its platform.
Speaking to ABC News' George Stephanopoulos on Thursday, the Facebook founder and CEO admitted that the artificial intelligence the company uses to filter harmful content failed to flag the video, adding that its reach might have been limited if livestreams were subject to a broadcast delay.
"But it would also fundamentally break what livestreaming is for people. Most people are livestreaming a birthday party or hanging out with friends when they can't be together," he said. "One of the things that's magical about livestreaming is that it's bi-directional, … you're not just broadcasting, you're communicating, and people are commenting back. So if you had a delay [it] would break that."
Although reluctant to introduce a delay on Facebook's livestream feature, Zuckerberg accepted that the company needed to work harder to "mitigate and remove as much of the negative [content] as possible."
He also told ABC that the way the company was run had significantly changed in recent years, with the policing of harmful content among the major issues Facebook was focusing on.
"Ninety-nine percent of the ISIS and Al-Qaeda content that we take down [is identified by] AI systems before any person sees it — so that's a good example of being proactive, and I think what we should hold all companies to account [for]," he said.
Footage of last month's mosque massacre was livestreamed on Facebook by the shooter.
On Thursday, Australian lawmakers passed legislation that could see social media executives face jail time and hefty fines if their platforms fail to remove violent content.