- cross-posted to:
- [email protected]
In June, the Texas Department of Public Safety (DPS) signed an acquisition plan for a 5-year, nearly $5.3 million contract for a controversial surveillance tool called Tangles from tech firm PenLink, according to records obtained by the Texas Observer through a public information request. The deal is nearly twice as large as the company’s $2.7 million two-year contract with the federal Immigration and Customs Enforcement agency (ICE).
Tangles is an artificial intelligence-powered web platform that scrapes information from the open, deep, and dark web. Tangles’ premier add-on feature, WebLoc, is controversial among digital privacy advocates. Any client who purchases access to WebLoc can track different mobile devices’ movements in a specific, virtual area selected by the user, through a capability called “geofencing.” Users of software like Tangles can do this without a search warrant or subpoena. (In a high-profile ruling, the Fifth Circuit recently held that police cannot compel companies like Google to hand over data obtained through geofencing.) Device-tracking services rely on location pings and other personal data pulled from smartphones, usually via in-app advertisers. Surveillance tech companies then buy this information from data brokers and sell access to it as part of their products.
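For readers curious about the mechanics, here is a minimal sketch of what a geofence query over brokered location pings could look like. The `Ping` record, the circular fence, and the haversine radius check are illustrative assumptions, not PenLink’s actual design:

```python
import math
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Ping:
    """One brokered location ping (field names are illustrative)."""
    ad_id: str  # mobile advertising ID
    lat: float
    lon: float
    ts: datetime

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def devices_in_geofence(pings, lat, lon, radius_m, start, end):
    """Return the ad IDs of devices that pinged inside a circular fence
    during the given time window."""
    return {
        p.ad_id
        for p in pings
        if start <= p.ts <= end
        and haversine_m(p.lat, p.lon, lat, lon) <= radius_m
    }
```

The point of the sketch is that no warrant is involved anywhere: the query is just a filter over a dataset the vendor already bought.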
WebLoc can even be used to access a device’s mobile ad ID, a string of numbers and letters that acts as a unique identifier for mobile devices in the ad marketing ecosystem, according to a US Office of Naval Intelligence procurement notice.
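For context, these identifiers (Android’s AAID, Apple’s IDFA) are plain UUID-format strings. A short illustration follows; the re-linking comment is an assumption about how brokered data gets joined, not a documented Tangles feature:

```python
import uuid

# A mobile advertising ID is formatted as a standard 128-bit UUID.
# This value is randomly generated, not a real device's ID.
ad_id = str(uuid.uuid4())
print(ad_id)  # e.g. "1b671a64-40d5-491e-99b0-da01ff1f3341"

# Because the same ID accompanies a device's pings across many apps,
# joining brokered datasets on this one key re-links its movements.
```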
Wolfie Christl, a public interest researcher and digital rights activist based in Vienna, Austria, argues that data collected for a specific purpose, such as by navigation or dating apps, should not be used by different parties for unrelated reasons. “It’s a disaster,” Christl told the Observer. “It’s the largest possible imaginable decontextualization of data. … This cannot be how our future digital society looks like.”
Don’t worry, it’s AI. It won’t work properly.
That might be fine for the false negatives, but not for the false positives.
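To make that asymmetry concrete, here is a back-of-the-envelope base-rate calculation. Every number is invented for illustration; none comes from the article:

```python
# Illustrative base-rate arithmetic with invented numbers.
population = 30_000_000       # roughly Texas-sized
base_rate = 1 / 100_000       # fraction of people actually of interest
sensitivity = 0.99            # assumed true positive rate
false_positive_rate = 0.01    # assumed

actual = population * base_rate                             # 300 people
true_hits = actual * sensitivity                            # ~297 flagged correctly
false_flags = (population - actual) * false_positive_rate   # ~299,997 flagged wrongly

precision = true_hits / (true_hits + false_flags)
print(f"true hits: {true_hits:.0f}, false flags: {false_flags:.0f}")
print(f"chance a flagged person is actually of interest: {precision:.2%}")
# ~0.10%: even a "99% accurate" dragnet buries real hits in false positives.
```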
Texas law enforcement doesn’t need a language model to blame false positives on. They can false-positively shoot whomever they want with no repercussions.
It can fail to work properly in the wrong direction, though.
That is the more likely scenario, sadly. The cops will say, “Don’t blame us that those kids got killed, we just listened to the AI,” and some judge will say, “Yeah, that’s OK. The officer did what he was told, like he is supposed to, unlike that bus of kindergarteners who didn’t lie down and put their hands behind their heads when commanded.”
But after they install cameras everywhere, and after a bunch of misses despite surveillance of everyone, when they actually do get someone, the amount of self-back-patting and chest-puffing will be off the charts.