Computer vision tools that analyze footage from conflict zones to gather evidence of human rights violations

Within the project VFRAME (Visual Forensics and Metadata Extraction), a team led by the artist Adam Harvey develops computer vision tools that help journalists, human rights researchers, and activists analyze footage from conflict regions. Since the project began in 2017, its focus has been on identifying specific munitions in videos and images, such as fragments of illegal cluster munitions like the 9N235/9N210 that have frequently appeared in the Ukraine conflict. Because verified images of this type of illegal munition are scarce, the team created synthetic datasets to train the computer vision algorithm. They used 3D visualization techniques to render the munition in different environments with randomized camera angles and lighting conditions. In a second step, the algorithm's performance was verified using a 3D-printed model of the object. The recognition software works significantly faster than human researchers combing through the images and is freely available under an MIT license.
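The randomization step described above is a form of domain randomization: each synthetic image is rendered with a freshly sampled camera pose and lighting setup so the detector does not overfit to one viewpoint. The sketch below illustrates the idea only; it is not VFRAME's actual pipeline, and all parameter names and ranges are assumptions. A real pipeline would feed such parameters into a renderer such as Blender.

```python
import math
import random

def sample_scene_params(seed=None):
    """Sample randomized rendering parameters for one synthetic image.

    A minimal sketch of domain randomization; the parameter names and
    value ranges are illustrative assumptions, not VFRAME's actual setup.
    """
    rng = random.Random(seed)
    # Place the camera on a sphere around the object (spherical coordinates).
    radius = rng.uniform(0.5, 3.0)             # camera distance in metres
    azimuth = rng.uniform(0.0, 2 * math.pi)    # rotation around the object
    elevation = rng.uniform(0.0, math.pi / 2)  # above-ground viewpoints only
    camera = (
        radius * math.cos(elevation) * math.cos(azimuth),
        radius * math.cos(elevation) * math.sin(azimuth),
        radius * math.sin(elevation),
    )
    # Randomize lighting and background so object appearance varies.
    return {
        "camera_position": camera,
        "light_intensity": rng.uniform(0.2, 1.5),
        "light_color_temp_k": rng.uniform(3000, 7500),
        "background_id": rng.randrange(100),  # index into a set of backdrops
    }

# Generate parameters for a batch of synthetic training renders.
dataset_params = [sample_scene_params(seed=i) for i in range(1000)]
```

Sampling a fixed seed per image keeps every render reproducible, which matters when a detection later needs to be audited as evidence.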

Adam Harvey is an artist, applied researcher, and software engineer based in Berlin. For this project he worked in close collaboration with mnemonic.org, a human rights organization that secures digital evidence from conflict zones and maintains archives on Syria, Yemen, Sudan, and Ukraine.

Digital documentation of international conflicts shared via social media has become a critical instrument in the prosecution of war crimes. At the same time, growing pressure on platforms and social media companies to take down this often traumatizing footage makes it harder for activists to secure it as evidence. Openly available systems that can deal effectively with this flood of images, like those developed within VFRAME, are therefore becoming essential.
