The US technology giant on Monday outlined four new steps designed to make the YouTube owner “part of the solution” to tackling extremist content online.
Item one on the to-do list is to commit more engineering resources to developing artificial intelligence software that can be trained to identify and remove extremist content.
Humans will, though, remain part of the operation, with Google increasing the number of independent experts in YouTube’s ‘Trusted Flagger’ programme.
YouTube also proposes to block monetization of videos containing inflammatory religious or supremacist content. It is also working with Jigsaw, an Alphabet company, which uses ad-targeting to direct anti-terrorist content to potential ISIS recruits.
“Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all,” said Kent Walker, Google’s general counsel.
He added: “Extremists and terrorists seek to attack and erode not just our security, but also our values; the very things that make our societies open and free. We must not let them.”