Biden Administration To Launch ‘Digital Cops’ To Arrest People Who Commit Wrongthink

Biden admin launching digital cops with the power of arrest

The Biden administration is working on a new scheme aimed at curbing ‘wrongthink’: the creation of a digital police officer with the powers of arrest.

The plan “reads like a pitch for the most dystopian buddy cop movie ever,” explains the report by Dave Maass at the Electronic Frontier Foundation.

Wnd.com reports: The work on a “D-PO,” which is now being presented as a “visionary concept,” is underway at the Pacific Northwest National Laboratory, which is run by Battelle on behalf of the U.S. Department of Energy.

Researchers are working on “forecasting a future where police and border agents are assisted by artificial intelligence, not as a software tool but as an autonomous partner capable of taking the steering wheel during pursuits and scouring social media to target people for closer investigation,” the report said.

EFF uncovered the work through a review of materials and Freedom of Information Act requests.

“We need to design computing systems that are not simply tools we use, but teammates that we work alongside,” the project explains at one point.

“For years, civil liberties groups have warned about the threats emerging from increased reliance by law enforcement on automated technologies, such as face recognition and ‘predictive policing’ systems. In recent years, we’ve also called attention to the problems inherent in autonomous police robots, such as the pickle-shaped Knightscope security patrol robots and the quadrupedal ‘dog’ robots that the U.S. Department of Homeland Security wants to deploy along the U.S.-Mexico border,” the foundation explained.

But it said the newest iteration “goes so much further.”

The idea is that AI learns “from the human and its environment,” and then uses that knowledge “to help guide the team without requiring specific instructions from the human.”

In its scenario, PNNL explains that the two “officers” get an alert of a robbery in progress, and immediately drones are tapped, face recognition used, self-driving tech incorporated and algorithmic prediction brought into play.

“While Officer Miller drives to the site of the robbery, D-PO monitors camera footage from an autonomous police drone circling the scene of the crime. Next, D-PO uses its deep learning image recognition to detect an individual matching the suspect’s description. D-PO reports to Officer Miller that it has a high-confidence match and requests to take over driving so the officer can study the video footage. The officer accepts the request, and D-PO shares the video footage of the possible suspect on the patrol car’s display. D-PO has highlighted the features on the video and explains the features that led to its high-confidence rating,” EFF’s report explained.
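For readers wondering what a “high-confidence match” means mechanically, the sketch below is purely illustrative and is not drawn from the PNNL materials: in systems of this kind, an image-recognition model reduces each detected person to a feature vector, and “confidence” is a similarity score compared against a chosen cutoff. Every name, structure and number here is hypothetical.

```python
# Illustrative sketch only -- not PNNL's D-PO system. All names, thresholds,
# and data structures are hypothetical; the point is that a "high-confidence
# match" is just a similarity score above an arbitrary cutoff, which can
# still flag the wrong person.
from dataclasses import dataclass

import numpy as np


@dataclass
class Detection:
    frame_id: int
    embedding: np.ndarray  # feature vector produced by an image-recognition model
    bbox: tuple            # (x, y, w, h) of the detected person in the frame


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def flag_high_confidence_matches(detections, suspect_embedding, threshold=0.85):
    """Return detections whose similarity to the suspect's description
    exceeds the threshold. The threshold is a policy choice, not ground
    truth: lowering it flags more innocent people, raising it misses
    more genuine matches."""
    matches = []
    for det in detections:
        score = cosine_similarity(det.embedding, suspect_embedding)
        if score >= threshold:
            matches.append((det, score))
    return matches
```

Nothing in that arithmetic resolves the trade-off between false positives and false negatives, which is why the mistaken-identity risk EFF describes applies at any threshold setting.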

Then there’s a discussion between Miller and the digital officer about how to apprehend the suspect.

“The authors leave the reader to conclude what happens next. If you buy into the fantasy, you might imagine this narrative ending in a perfect apprehension, where no one is hurt and everyone receives a medal–even the digital teammate. But for those who examine the intersection of policing and technology, there are a wide number of tragic endings, from mistaken identity that gets an innocent person pulled into the criminal justice system to a preventable police shooting–one that ends in zero accountability, because Officer Miller is able to blame an un-punishable algorithm for making a faulty recommendation,” EFF’s report said.

The organization reported that the tech is apparently a “long way off,” but noted that one city police department has already expressed interest in the capabilities.

But the report noted that the tech is also being pitched as an option for Customs and Border Protection officers.

“CBP is infamous for investing in experimental technologies in the name of border security, from surveillance blimps to autonomous surveillance towers. In the PNNL scenario, the Border Inspections Teammate System (BITS) would be a self-directed artificial intelligence that communicates with checkpoint inspectors via an augmented reality (AR) headset,” the report said.

EFF then warned of the problems of adopting unproven tech, which is “often based on miraculous but implausible narratives promoted by tech developers and marketers, without contemplating the damage they might cause.”

“Society would be better served if the PNNL team used their collective imagination to explore the dangers of new policing technologies so we can avoid the pitfalls, not jetpack right into them,” the report noted.
