A week after news broke that multiple videos of suicides posted to Facebook had remained on the site for hours, the company has announced a plan to hire 3,000 more people for its operations team to screen harmful videos and other posts and respond to them more quickly in the future.

Mark Zuckerberg, the CEO of Facebook, said that the new hires would be in addition to the 4,500 people already working in this capacity.

It’s a big hiring move, but is it enough? The company currently has close to 2 billion users — we’ll get an update on that number later today when Facebook posts its quarterly earnings — and Zuckerberg said the company receives “millions of reports” every week.

“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote in a post earlier today. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

The move to add more human curation into the mix is nevertheless a step in the right direction. To date, the company has put more emphasis on building algorithms and on mechanisms for people to report friends, or even themselves, if they are concerned. In March, it launched a new set of suicide prevention tools, and a month later it introduced new tech to combat revenge porn.

While the new reviewers are meant to ease the bottleneck between content being reported and that content being taken down, Facebook said it will continue to work with authorities and also continue to invest in — yes — more technology.

“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer,” Zuckerberg wrote.

More to come.

Featured Image: Jaap Arriens/NurPhoto/Getty Images