Using AI and Film to Track Tear Gas Use Against Civilians

2019/08/02

Filmmaker Laura Poitras’ Oscar-winning documentary Citizenfour centers on the pale, calm face of National Security Agency leaker Edward Snowden holed up in a plain Hong Kong hotel room. Her latest star is even more impassive—an aluminum tear gas grenade—but it's seen amid surreal pulsating color.

The 10-minute short, Triple Chaser, is named for that tear gas canister, a branded product of Florida’s Safariland Group that human rights groups say has been used by US border agents against migrants in Tijuana and by Israeli forces in Palestine. The film, a collaboration with the nonprofit Forensic Architecture, documents how that organization is developing machine-learning software that could help uncover where Triple Chaser canisters are being launched or thrown.

It also accuses prominent arts donor Warren Kanders, whose investment company acquired Safariland in 2012, of profiting from state attacks on civilians, including with lethal force. Forensic Architecture’s software and Poitras’ film were created to protest his presence on the board of the Whitney Museum of American Art. Kanders did not respond to a request for comment Wednesday, and on Thursday he announced he was resigning from the board.

The London-based group set out to build its Triple Chaser detector after being invited to participate in the Whitney’s prestigious Biennial show. “We decided that we would only participate if we could investigate,” says Eyal Weizman, Forensic Architecture’s director and founder. The Whitney did not object, and the group subsequently teamed up with Poitras and her company, Praxis Films.

The resulting film debuted at the Whitney Biennial in May and was released online Thursday, here on WIRED and on Forensic Architecture’s website. However, Forensic Architecture announced last weekend that it would withdraw its work from the show, to protest Kanders’ presence on the board. (Seven other artists have announced plans to withdraw their own works for the same reason.) In a statement, Whitney director Adam Weinberg said the gallery respects the right of artists to express themselves as they see fit. “While the Whitney is saddened by this decision, we will of course comply with the artists’ request,” he said.

A mesmerizing sequence at the film’s heart strobes through images that Forensic Architecture used to train machine-learning software to recognize Triple Chaser grenades. As Richard Strauss’s strings swell and David Byrne narrates, the viewer is assaulted with surreal images of Triple Chaser canisters against colorful spots, swirls, and Mondrian rectangles.

To train its software—and make those images—Forensic Architecture gathered about 130 photos and videos of the canisters that were visible online, from 14 countries. Training an image-recognition system typically requires thousands of images, so the group used the real images and product specifications to create a photorealistic 3D model of the grenade. That in turn was used to make thousands of synthetic images in which canisters sit alone or in piles on high-contrast patterns, as well as in simulated environments including city streets, forest floors, and mud puddles.

Using synthetic data to train machine-learning algorithms is an increasingly standard technique in AI projects. Amazon used it to develop the software that tracks shoppers in its checkout-free stores. Putting objects of interest against artificial backgrounds, as Forensic Architecture did, helps ensure the software learns to fixate on the intended subject and ignore the surroundings.
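
A minimal sketch of that compositing idea, under stated assumptions: this is not Forensic Architecture’s actual pipeline. The Pillow-based approach, the output file names, and canister.png (standing in for an RGBA render of the group’s 3D model) are all hypothetical, chosen only to show the shape of the technique.

```python
# Sketch: composite a rendered object onto high-contrast procedural
# backgrounds so a detector learns the object, not its surroundings.
# NOT Forensic Architecture's pipeline; "canister.png" is hypothetical.
import os
import random

from PIL import Image, ImageDraw


def random_background(size=(512, 512)):
    """Fill a canvas with bright random rectangles (Mondrian-style)."""
    bg = Image.new("RGB", size)
    draw = ImageDraw.Draw(bg)
    for _ in range(random.randint(5, 15)):
        x0, y0 = random.randint(0, size[0] - 2), random.randint(0, size[1] - 2)
        x1, y1 = random.randint(x0 + 1, size[0]), random.randint(y0 + 1, size[1])
        draw.rectangle([x0, y0, x1, y1],
                       fill=tuple(random.randint(0, 255) for _ in range(3)))
    return bg


def make_synthetic_set(render, n=1000, out_dir="synthetic"):
    """Paste the render at random positions/scales; save image plus box label.

    Assumes the render (even at max scale) fits inside the canvas.
    """
    os.makedirs(out_dir, exist_ok=True)
    for i in range(n):
        bg = random_background()
        scale = random.uniform(0.3, 0.8)
        obj = render.resize((int(render.width * scale), int(render.height * scale)))
        x = random.randint(0, bg.width - obj.width)
        y = random.randint(0, bg.height - obj.height)
        bg.paste(obj, (x, y), obj)  # the alpha channel doubles as the paste mask
        bg.save(f"{out_dir}/{i:05d}.png")
        with open(f"{out_dir}/{i:05d}.txt", "w") as f:
            f.write(f"{x} {y} {obj.width} {obj.height}\n")  # bounding box for training


if __name__ == "__main__":
    make_synthetic_set(Image.open("canister.png").convert("RGBA"))
```

A real pipeline would also vary pose, lighting, and occlusion, and mix in simulated scenes like the streets and mud puddles the film shows, but the learning principle is the same: the object stays constant while everything around it changes.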

When Poitras saw some of Forensic Architecture’s striking synthetic images during a visit to the group’s London base last year, she felt compelled to give them a broader audience. “I like the fact that they weren’t made kitsch and psychedelic for human purposes but to give information to the machine-learning classifier,” she says.

Machine-learning training sets are crucial to corporate technology projects such as self-driving cars or selfie-enhancing software that helps sell $1,000 smartphones. The US military is investing in machine learning to process drone and satellite surveillance imagery. The film gives viewers a sense of how AI practitioners feed their algorithms in such projects—and a reminder that AI has uses outside of chasing profits or enhancing military power.

Forensic Architecture was already known for using software for human rights research, activism, and art. By combining images and video, often posted online by witnesses, into 3D models, the group has reconstructed events such as a suspected Syrian chemical weapons attack it investigated with The New York Times, or the death of a man hit by a tear gas canister in the West Bank. The group sometimes presents its findings as artworks, but it has also presented them in courtrooms and to the United Nations.

Forensic Architecture turned to machine learning to more quickly identify and gather video posted online by witnesses of conflicts, protests, and other events. Thanks to the spread of smartphones, incidents that once went undocumented are now captured from countless angles. Gathering this material, dubbed “open source” in a phrase borrowed from intelligence work, has given human rights organizations a powerful new lens on the world—but the volume has become overwhelming.

“At first you would put in some keywords and there would be three or four videos,” says Weizman. “Now we’re swimming in an ocean of images.” Forensic Architecture started experimenting with machine learning to accelerate the work of gathering and screening material.

Its first AI project, begun last fall, trained a machine-learning model to look for tanks in online images or video, to help chart Russian military activity in eastern Ukraine. The results were presented to the European Court of Human Rights as part of a case against Russia and Ukraine.

Forensic Architecture also created an open source tool called mtriage that can crawl YouTube to download videos matching certain keywords and dates. It extracts frames from the videos so they can be passed through a machine-learning model to highlight material for closer analysis.
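
The sketch below illustrates that crawl-and-sample workflow in miniature. It is not mtriage’s actual interface; it simply pairs two common libraries, yt-dlp for search and download and OpenCV for frame extraction, and the search query and sampling rate are placeholder assumptions.

```python
# Illustrative sketch of the workflow described above: fetch videos that
# match keywords, then sample frames for a classifier. NOT mtriage's API.
import cv2
import yt_dlp


def download_matching(query, max_results=5):
    """Download videos for a YouTube search query; return local file paths."""
    opts = {"format": "mp4", "outtmpl": "videos/%(id)s.%(ext)s"}
    with yt_dlp.YoutubeDL(opts) as ydl:
        info = ydl.extract_info(f"ytsearch{max_results}:{query}", download=True)
        return [ydl.prepare_filename(entry) for entry in info["entries"]]


def extract_frames(path, every_n=30):
    """Yield one frame in every_n, ready to pass to an image classifier."""
    cap = cv2.VideoCapture(path)
    i = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % every_n == 0:
            yield frame
        i += 1
    cap.release()


if __name__ == "__main__":
    for video in download_matching("tear gas canister tijuana"):  # placeholder query
        for frame in extract_frames(video):
            pass  # here each frame would go through the trained detector
```

Sampling one frame in every 30 (roughly once per second for typical video) keeps the classifier’s workload manageable while rarely missing an object that stays in view.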

The Triple Chaser detector is also intended to plug into mtriage. The software has proven capable of flagging images showing Triple Chaser canisters that it hasn’t seen before, although the team is still working to improve its reliability.

Poitras’ film ends with new allegations against Kanders, based on evidence gathered by a researcher she brought into the collaboration with Forensic Architecture. It claims that Sierra Bullets, a subsidiary of Clarus Corporation, a public company where Kanders is executive chairman, sells bullets to IMI, a supplier to the Israeli military. The film—and subsequent research by Forensic Architecture—suggest Sierra’s ammunition may have been used against civilians in deadly incidents last year that a UN investigation concluded could have been war crimes. Sierra Bullets, Clarus Corporation, and IMI did not respond to requests for comment.