One of the lessons of the last few weeks of lies and insanity from the left is that America’s law enforcement community, which puts life and limb on the line every day to keep the country’s citizens safe, has few friends among the country’s political and media establishment — and even fewer in the Big Tech hubs of Silicon Valley and Seattle.
Spurred on by Antifa and Black Lives Matter’s two weeks of hate, rioting, and violence, IBM, Microsoft, and Amazon have all announced that they will no longer allow American police departments to use their facial recognition technology. As the country descends into lawlessness, Silicon Valley wants to make the job of law enforcement harder.
One could argue that there’s an upside to this; there are obvious concerns with facial recognition technology, which remains largely unregulated. Silicon Valley has already built a privatized version of China’s totalitarian social credit system, in which users of tech platforms are awarded secret scores for good behavior, and secret penalties for bad behavior (like having the wrong political views). And now they’re building the face-tracking technology to go with it!
But that point is moot — the tech companies boycotting law enforcement haven’t said they’ll stop work on facial recognition technology altogether. They’ve just said that the police can’t use it. If they want to put your face in their “hate agents” databases, so that their cameras can detect when a heretic is in the vicinity, nothing is stopping them.
The lesson for America’s law enforcement extends to more than just facial recognition. It is this: the police cannot rely on hyper-leftist technology companies.
A police department that uses a service from Amazon, Microsoft, IBM, Google, or Facebook has no way of knowing whether its access will suddenly be removed because the company’s blue-haired social justice warrior employees had a hissy fit. Imagine, if you will, an Amazon-designed police drone that gets switched off and falls out of the sky in the middle of an operation because an executive thousands of miles away suddenly felt the need to virtue-signal.
In the case of facial recognition, police departments are in luck. As it is an emerging technology, the market is not yet completely dominated by the established leftist companies. NEC, Ayonix, and Clearview AI have all said they will continue to work with law enforcement.
You may have heard of Clearview before — it has faced an unrelenting barrage of negative press, in large part because its CEO, Hoan Ton-That, had the audacity to associate with conservatives. Much like Oculus founder Palmer Luckey, who was fired from Facebook after being outed as a Trump supporter, Ton-That discovered that you aren’t allowed to be a whizz-kid tech entrepreneur if you have the wrong political opinions. Peter Thiel only gets away with it because he started early and is smarter than the competition.
There are legitimate security and privacy concerns around Clearview, as there are with all facial recognition technology. Nobody likes the Minority Report notion of an all-seeing eye following them wherever they go. But while that’s a solid argument against the industry as a whole, it’s pretty clear why left-wing journalists are gunning for Clearview more than IBM, Microsoft, or Amazon. If conservatives are allowed to develop the technology, reason leftists, they could do crazy things — like work with the police!
As the American left grows ever more extreme, police departments will have to think more clearly about the political culture of companies they do business with. The big tech companies are now radicalized against the police. And if they don’t trust the police, the police shouldn’t trust them.
Are you an insider at Google, Reddit, Facebook, Twitter, or any other tech company who wants to confidentially reveal wrongdoing or political bias at your company? Reach out to Allum Bokhari at his secure email address allumbokhari@protonmail.com.
Allum Bokhari is the senior technology correspondent at Breitbart News.