A specialist anti-terror team has registered a jump in reports relating to extreme right-wing material online.

More than one in four referrals about web content from the public are now made over far-right concerns.

The national unit has instigated the removal of just over 304,000 online videos, web pages and posts in total since it was established in 2010.

Police said the vast majority of the content relates to Islamist extremism.

But the Counter-Terrorism Internet Referral Unit has seen an increase over the past year in material related to extreme right-wing ideology.

Figures released by Scotland Yard show that of 1,849 public referrals to the CTIRU since March last year, 481, or 26%, related to extreme right-wing material.

Senior officers have warned that the threat from both Islamist and far-right terrorism is growing.

Concerns over the availability of terror-related and extremist content on the web intensified after Britain was hit by a string of atrocities.

All five attacks on UK soil last year had an "online component", and tech companies have come under sustained pressure over their efforts to remove extremist material.

The CTIRU works with hundreds of organisations to remove content including propaganda and recruitment videos, images of executions and speeches calling for racial or religious violence.

Figures released on Friday show the unit has also provided key evidence and information in 226 counter-terror investigations since its launch.

The team's main function in previous years has been getting content taken down, but officers have been able to shift their focus towards investigative work as tech providers step up their efforts to remove material automatically.

In February Mohammed Kamal Hussain, 28, from east London, was jailed for encouraging terrorism and supporting a proscribed organisation as a direct result of a public referral to the CTIRU.

Metropolitan Police Commander Clarke Jarrett, head of the Counter Terrorism Command, said: "This example shows how the public has a vital role to play in the fight against terrorism.

"I would urge anyone who comes across what they think could be extremist or terrorist material - whether it's right-wing, Islamist or indeed any other form of extremism - to report it to us.

"Our specialist officers can then assess the material and take appropriate action to potentially prevent further terrorist activity from happening."

A number of companies have detailed the steps they are taking to clamp down on terrorist content.

In the second half of last year, Twitter suspended 274,460 accounts for violations related to the promotion of terrorism.

The site said 93% of the suspended accounts were flagged by its "internal, proprietary spam-fighting tools", and 74% were suspended before posting their first tweet.

YouTube has introduced "machine learning" to help identify extremist and terror-related material, removing more than 150,000 videos between June and December.

Facebook has deployed artificial intelligence as part of its efforts, with 99% of Islamic State and al-Qaida-related content removed before it is flagged by users.

By Hayden Smith