Fast Company, June 27, 2016: Putting Artificial Intelligence On The Hunt For Poachers

June 27, 2016

 

Putting Artificial Intelligence On The Hunt For Poachers

To protect endangered species, an algorithm used to stop terrorists is put to work protecting the jungle.

BY BEN PAYNTER

The problem of how to defend a country changes when your attacker isn’t acting rationally. Terrorists put their causes above their home country and don’t necessarily fear death or retaliation. So shortly after 9/11, Milind Tambe, a professor of computer science and engineering at USC, proposed a radical new style of protection: Why not use artificial intelligence to make your own targets harder to attack? By matching predictive algorithms with machine learning and some massive processing power, you could create a computer program capable of figuring out how to deploy limited security forces around sensitive places most effectively. The trick would be for those schedules or formations to remain unpredictable. “If you play rock-paper-scissors you don’t want to play one strategy or your adversary can defeat it. You want to randomize. That’s the logic here,” Tambe says.
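To make that rock-paper-scissors logic concrete, here is a minimal Python sketch of a toy two-target security game. The target names, values, and single-guard setup are invented for illustration; the real systems solve much larger game-theoretic optimizations. The point is only that a predictable defense gets exploited by an attacker who watches it, while a randomized one limits the worst-case loss.

# A toy two-target security game with invented numbers. The defender can
# guard exactly one of two targets; an attack on an unguarded target costs
# the defender that target's value, an attack on a guarded one costs nothing.
target_values = {"terminal": 10, "parking": 4}

def defender_loss(p_guard_terminal):
    """Worst-case expected loss when the attacker observes the defender's
    strategy and then attacks whichever target is better for them."""
    loss_if_attack_terminal = (1 - p_guard_terminal) * target_values["terminal"]
    loss_if_attack_parking = p_guard_terminal * target_values["parking"]
    return max(loss_if_attack_terminal, loss_if_attack_parking)

# A predictable (pure) strategy: always guard the terminal.
print("always guard the terminal:", defender_loss(1.0))           # loss 4.0
# A mixed strategy that equalizes the attacker's two options does better.
best_p = target_values["terminal"] / sum(target_values.values())  # about 0.71
print("randomized strategy:      ", defender_loss(best_p))        # loss about 2.86

Running the sketch shows the worst-case loss falling from 4 to roughly 2.9 once the defender randomizes, which is exactly the argument for unpredictability over fixed schedules.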

Over the last decade, Tambe and his team have joined with USC’s National Center for Risk and Economic Analysis of Terrorism Events and partnered with the Department of Homeland Security and the Army Research Office to rewire how several major law enforcement agencies operate. In 2013, he also co-founded ARMORWAY, a private company that continues to refine those breakthroughs for clients, including the Los Angeles Unified School District, and recently closed a $2.5 million round of funding. But it’s not all security: Earlier this month, Tambe headlined at a White House workshop called Artificial Intelligence for Social Good.

To that end, Tambe’s team is moving on to even more complex scenarios. They recently tested a new program dubbed PAWS (Protection Assistant for Wildlife Security) to help animal rights organizations battle poaching in places like Uganda and Malaysia. PAWS compares animal paths with things like terrain shifts, park borders, access roads, and known poaching hotspots to create smarter patrol maps over rough terrain. During an evaluation period, field agents came across about 30% more signs of human activity, like makeshift camps, marked trees, or a discarded lighter. For that work, Tambe recently earned the Association for the Advancement of Artificial Intelligence’s Innovative Application of AI Award.
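The patrol-planning idea can be sketched in a few lines of Python. The grid cells, features, and weights below are made up for illustration; PAWS itself learns a risk model from ranger-collected data and plans full routes over terrain, which this sketch does not attempt.

import math
import random

# Hypothetical park grid cells: (cell_id, km to nearest access road,
# near water, count of past poaching signs). All values are invented.
cells = [
    ("A1", 0.5, True, 3),
    ("A2", 4.0, False, 0),
    ("B1", 1.2, True, 1),
    ("B2", 2.5, False, 2),
]

def risk_score(km_to_road, near_water, past_signs):
    # Invented weights: poachers favor easy access, water (where animal
    # paths converge), and places where snares or camps were found before.
    return math.exp(-km_to_road) + (0.5 if near_water else 0.0) + 0.3 * past_signs

def sample_patrol(rng, n_cells=2):
    """Pick cells to patrol today, weighted by risk but still randomized."""
    ids = [c[0] for c in cells]
    weights = [risk_score(*c[1:]) for c in cells]
    chosen = set()
    # Sampling, rather than always visiting the top-scoring cells, keeps the
    # patrol pattern unpredictable to poachers who watch ranger movements.
    while len(chosen) < n_cells:
        chosen.add(rng.choices(ids, weights=weights, k=1)[0])
    return sorted(chosen)

print(sample_patrol(random.Random()))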

The way it works, for both poachers and criminals, is rooted in software called Assistant for Randomized Monitoring Over Routes (ARMOR). One of its uses was in setting up checkpoints at intervals along the eight roads leading into Los Angeles International Airport (LAX). When it rolled out in 2008, the Los Angeles airport police’s annual arrest rate for drugs and miscellaneous crimes in that area more than quadrupled, then held at that level for two more years before arrests started to decline. Since then, Tambe has launched schedule and tactical optimization programs for federal air marshals and the LA Sheriff’s Department commuter train police, as well as a system dubbed PROTECT (Port Resilience Operational/Tactical Enforcement to Combat Terrorism), a Coast Guard initiative that changes waterborne patrol patterns to scout docks and guard moving ships. For commuters from Jersey, this should help explain why the patrol boat escorting the Staten Island ferry often zigzags unexpectedly.
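The checkpoint scheduling can be sketched along similar lines: a limited number of checkpoint teams is assigned to distinct roads each day by weighted random draw, so the layout never settles into a rotation an observer could learn. The road names and weights here are placeholders, not ARMOR’s actual inputs, which come from a game-theoretic model.

import random

# Eight roads into the airport with made-up risk weights.
road_weights = {
    "road_1": 3.0, "road_2": 2.5, "road_3": 2.0, "road_4": 1.5,
    "road_5": 1.5, "road_6": 1.0, "road_7": 1.0, "road_8": 0.5,
}

def daily_checkpoints(rng, teams=3):
    """Place a limited number of checkpoint teams on distinct roads,
    drawn at random (weighted by risk) so the layout changes every day."""
    remaining = dict(road_weights)
    picks = []
    for _ in range(teams):
        roads = list(remaining)
        weights = [remaining[r] for r in roads]
        pick = rng.choices(roads, weights=weights, k=1)[0]
        picks.append(pick)
        del remaining[pick]  # no two teams on the same road
    return picks

rng = random.Random()
for day in ("Mon", "Tue", "Wed"):
    print(day, daily_checkpoints(rng))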

Rebuffed evildoers don’t exactly stick around to be audited, so Tambe’s team can’t track how many terrorist attempts they’ve deterred. Still, they have their own simulations, and copious academic studies, that show success. In some cases, they’ve hired mock attackers to try to penetrate their robo-defense network. In others, more telling trends emerge: those LAX stops proved effective at finding firearm violations. Of course, protecting complicated targets with limited resources means leaving small windows of exposure. Randomization can also mean repeating the same coverage in one area a few times in a row; to an adversary who assumes the patrols rotate, that repetition still comes as a surprise. That’s the sort of move that machines grasp better than people. “Being random is not natural,” Tambe says.


 

Read Fast Company article
