Deepfake maps could really mess with your sense of the world

Satellite images showing the expansion of large detention camps in Xinjiang, China, between 2016 and 2018 provided some of the strongest evidence of a government crackdown on more than a million Muslims, triggering international condemnation and sanctions.

Other aerial images, such as those of nuclear installations in Iran and missile sites in North Korea, have had a similar impact on world events. Now, image-manipulation tools made possible by artificial intelligence may make it harder to accept such images at face value.

In a paper published online last month, University of Washington professor Bo Zhao employed AI techniques similar to those used to create so-called deepfakes to alter satellite images of several cities. Zhao and colleagues swapped features between images of Seattle and Beijing to show buildings where there are none in Seattle and to remove structures and replace them with greenery in Beijing.

Zhao used an algorithm called CycleGAN to manipulate satellite photos. The algorithm, developed by researchers at UC Berkeley, has been widely used for all sorts of image trickery. It trains an artificial neural network to recognize the key characteristics of certain images, such as a style of painting or the features on a particular type of map. Another algorithm then helps refine the performance of the first by trying to detect when an image has been manipulated.
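To make that generator-versus-detector loop concrete, here is a minimal, illustrative sketch of a CycleGAN-style setup in PyTorch. It is not Zhao's code or the UC Berkeley release: the tiny networks, 64x64 tile size, loss weights, and toy random "Seattle" and "Beijing" batches are all placeholder assumptions for demonstration only. It simply shows the two translators (one per direction), the two discriminators that try to spot manipulated images, and the cycle-consistency term that keeps a translated-and-translated-back image close to the original.

```python
# Minimal CycleGAN-style sketch (illustrative only; toy sizes and data).
import torch
import torch.nn as nn

def make_generator():
    # Tiny image-to-image "translator": 3-channel tile in, 3-channel tile out.
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
    )

def make_discriminator():
    # Tiny critic that scores whether a 64x64 tile looks like it belongs to a domain.
    return nn.Sequential(
        nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Flatten(), nn.Linear(16 * 32 * 32, 1),
    )

# Two generators (A->B and B->A) and one discriminator per domain.
G_ab, G_ba = make_generator(), make_generator()
D_a, D_b = make_discriminator(), make_discriminator()

opt_g = torch.optim.Adam(list(G_ab.parameters()) + list(G_ba.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(list(D_a.parameters()) + list(D_b.parameters()), lr=2e-4)
adv_loss, cycle_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()

# Stand-in batches for "domain A" and "domain B" (random tensors, not real imagery).
real_a = torch.rand(4, 3, 64, 64)
real_b = torch.rand(4, 3, 64, 64)

for step in range(5):
    # Generator update: fool the discriminators and preserve cycle consistency.
    fake_b = G_ab(real_a)                                 # A translated to look like B
    fake_a = G_ba(real_b)                                 # B translated to look like A
    loss_g = (
        adv_loss(D_b(fake_b), torch.ones(4, 1))           # D_b should call fake_b "real"
        + adv_loss(D_a(fake_a), torch.ones(4, 1))
        + 10.0 * cycle_loss(G_ba(fake_b), real_a)         # A -> B -> A should return to A
        + 10.0 * cycle_loss(G_ab(fake_a), real_b)         # B -> A -> B should return to B
    )
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    # Discriminator update: learn to separate real tiles from generated ones.
    loss_d = (
        adv_loss(D_b(real_b), torch.ones(4, 1))
        + adv_loss(D_b(fake_b.detach()), torch.zeros(4, 1))
        + adv_loss(D_a(real_a), torch.ones(4, 1))
        + adv_loss(D_a(fake_a.detach()), torch.zeros(4, 1))
    )
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
```

The adversarial pressure from the discriminators is what makes the forgeries convincing: the generators keep improving until the detector networks can no longer reliably tell translated tiles from genuine ones, which is exactly why such output can be hard to flag by eye.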

This is an extract from an article written by Will Knight, republished from _Wired_, 28 May 2021. Click here to read the original article in its entirety.
