MPs have slammed Google's handling of violent and extremist content after the company failed to take white supremacist clips off YouTube.
It comes after YouTube failed to remove four versions of a neo-Nazi video by National Action for more than a year after politicians flagged them.
MPs branded the Google-owned video-sharing platform 'utterly hopeless' over its failure to tackle far-right posts.
They hit out at the 'frankly astonishing' low number of staff employed directly to moderate content and criticised the company for having no members of that team based in the UK.
Labour’s Yvette Cooper, who chairs the committee, said: 'Google’s response just isn’t good enough.
'This incredibly rich and powerful global company has a huge responsibility to stop its platforms being used for crime, extremism and damage to young people.
'Yet in most cases it doesn’t even employ its own staff to work on tackling illegal or abusive content, it contracts the problem out to others instead.
'And it also turns out that none of those specialist reviewers are based in the UK at all.'
YouTube works with more than 4,000 contractors to review content and employs 200 staff directly.
The figures were set out in a letter from William McCants, global leader for counterterrorism at YouTube. He apologised to the committee at a hearing in March after it raised concerns that four versions of a National Action video were still on the site more than a year after being flagged by MPs.
In the letter, Mr McCants said four reviewers who made 'incorrect decisions' on the white supremacist films were 'removed from live reviews' and will be more closely monitored after retraining.
Ms Cooper said there had been 'no grip' on contracting or training arrangements.
She added: 'If lack of directly employed staff in the UK explains why YouTube were so utterly hopeless at removing banned National Action videos it proves they need to think again.
'We raised those illegal videos repeatedly over twelve months with Google and YouTube top executives, yet we still found them on the platform.
'Google has already admitted to us that their content moderators weren’t sufficiently sensitive to far right extremism and terror threats in the UK. Now we learn why, if none of them are based here.
'Given that preventing illegal activity online should be a huge part of YouTube and Google’s central corporate purpose, it is frankly astonishing that less than five per cent of those working on content moderation decisions are actually employed directly by the company.
'At a time when we know far-right extremism is on the rise in the UK, online companies have a responsibility to act proactively and decisively to do all they can to ensure extremist views are not given a platform, rather than responding only after the negative publicity of a public hearing.'
© Associated Newspapers Ltd.