Internet companies should face a tax punishment for failing to deal with the threat of terrorism in the UK, security minister Ben Wallace has said.
Wallace said firms such as Facebook, Google and YouTube were too slow to remove radical content online, forcing the government to act instead.
He added that while tech firms were "ruthless profiteers", governments were spending millions policing the web.
Tech firms have called on governments to help them remove extremist content.
In an interview with the Sunday Times, Wallace said tech giants were failing to help prevent the radicalisation of people online.
"Because content is not taken down as quickly as they could do," he claimed, "we're having to de-radicalise people who have been radicalised. That's costing millions."
He said the refusal of messaging services - such as WhatsApp, which is owned by Facebook - to give the security services access to message data was "turning the internet into an anarchic violent space".
"Because of encryption and because of radicalisation, the cost of that is heaped on law enforcement agencies," Wallace told the newspaper.
He said "the time for excuses is at an end" and the government should look at "all options" of incentivising firms - "including tax".
"If they continue to be less than co-operative, we should look at things like tax as a way of incentivising them or compensating for their inaction."
"They will ruthlessly sell our details to loans and soft-porn companies but not give it to our democratically elected government."
'Further and faster'
In September, Prime Minister Theresa May called on tech giants to end the "safe spaces" she said terrorists enjoyed online.
Technology companies must go "further and faster" in removing extremist content, she added.
Google, Facebook and YouTube have yet to respond to Wallace's remarks.
However, speaking in September, Kent Walker, general counsel for Google, said tech firms would not be able to "do it alone".
"We need people and we need feedback from trusted government sources and from our users to identify and remove some of the most problematic content out there."
Facebook and Twitter said they were working hard to rid their networks of terrorist activity and support.
YouTube told the BBC that it received 200,000 reports of inappropriate content a day, but managed to review 98% of them within 24 hours.