YouTube anti-terrorism campaign
Reuters

Following Facebook's move to help fight terrorism, YouTube has announced that it is doing its part to combat online terror.

Google announced on Sunday that YouTube will be implementing additional rules and regulations, specifically designed to filter out content that incites terror among audiences. In a blog post, Google vice president and general counsel Kent Walker outlined "four steps...to fight online terror".

In the first two points, Walker revealed that videos identified as blatantly encouraging terrorism will automatically be taken down. The executive, however, acknowledged that this is easier said than done: with an hour of video uploaded to YouTube every second, tracking down such content is a real challenge for the platform.

"This can be challenging: a video of a terrorist attack may be informative news reporting by the BBC, or glorification of violence if uploaded in a different context by a different user," writes Walker. But with the help of their advanced machine learning, hunting down extremist propaganda could be much easier.

While artificial intelligence might be a huge help, Walker stressed in his second point the contribution of humans in flagging inappropriate content. The platform currently relies on video analysis software and human content flaggers to trace content that does not meet its standards. Google is adding 50 experts from non-governmental organisations to review content.

In the third step, Walker revealed that YouTube will be going after creators who publish "videos that contain inflammatory religious or supremacist content". Because such videos do not violate YouTube's policies, they will not be taken down; instead, they will be made harder to find.

The final step uses "targeted online advertising to reach potential ISIS recruits" and redirects them to anti-terrorism content in the hope of changing their stance.

Apart from Facebook and YouTube, Twitter has also suspended nearly 400,000 accounts suspected of supporting terrorism.