...That’s what Progeny claims it can do: take an existing drone, like the hand-held Raven, and turn it into a TTL machine. “Any pose, any expression, any lighting,” Faltemier says. Progeny needs an image with just 50 pixels between the target’s eyes to build a 3D model of his face. That’s about the same resolution it takes to capture a traditional 2D image. (Naturally, the model gets better and better the more pictures are taken during enrollment.) Once the target is “enrolled” in Progeny’s system, it might take only 15 or 20 pixels to identify him again. A glance or two at a Raven’s camera might conceivably be enough.
And if the system can’t get a good enough look at a target’s face, Progeny has other ways of IDing its prey. The key, developed under a previous Navy contract, is a kind of digital stereotyping. Using a series of so-called “soft biometrics” — everything from age to gender to “ethnicity” to “skin color” to height and weight — the system can keep track of targets “at ranges that are impossible to do with facial recognition,” Faltemier says. Like 750 feet away or more.
But if Progeny can get close enough, Faltemier says his technology can even tell identical twins apart. With backing from the Army, researchers from the University of Notre Dame and Michigan State University collected images of faces at a “Twins Days” festival. Progeny then zeroed in on the twins’ scars, marks, and tattoos — and was able to spot one from the other. The company says the software can help the military “not only learn the identity of subjects but also their associations in social groups.”
The Pentagon isn’t content to simply watch the enemies it knows it has, however. The Army also wants to identify potentially hostile behavior and intent, in order to uncover clandestine foes.
Charles River Analytics is using its Army cash to build a so-called “Adversary Behavior Acquisition, Collection, Understanding, and Summarization (ABACUS)” tool. The system would integrate data from informants’ tips, drone footage, and captured phone calls. Then it would apply “a human behavior modeling and simulation engine” that would spit out “intent-based threat assessments of individuals and groups.” In other words: This software could potentially find out which people are most likely to harbor ill will toward the U.S. military or its objectives. Feeling nervous yet?
“The enemy goes to great lengths to hide his activities,” explains Modus Operandi, Inc., which won an Army contract to assemble “probabilistic algorithms th[at] determine the likelihood of adversarial intent.” The company calls its system “Clear Heart.” As in, the contents of your heart are now open for the Pentagon to see. It may be the most unnerving detail in this whole unnerving story.
Obviously, this will work just as well as the precision surgical drone strikes have in Iraq and Afghanistan. We have never killed a single non-combatant with the drones we already have — just ask the Peace Laureate or any other Company representative. To think otherwise clearly marks your adversarial intent.