AI, Neural Nets and the Decriminalization of Video Evidence
We are in the midst of an industrial revolution, one larger in scale than any before it. Historians generally agree that our world has experienced three great and distinct industrial revolutions: the Mechanical Revolution that came as a result of the steam engine, the Revolution of Science and Mass Production, and more recently, the Technological Revolution. This very moment, we are experiencing the mere beginning of the Information Revolution.
Jake. Image: thispersondoesnotexist.com
This is Jake. Jake looks like a normal guy. He may even look vaguely familiar to you. Jake, however, does not exist. He was never born and has never breathed. Jake is the product of a generative adversarial network, or GAN. Simply put, a GAN pits two neural nets, powerful, adaptive artificial intelligences, against each other, and the competition improves both with every round.
In a way, GANs are similar to two people playing chess. As one player tries new tactics and positions, the other learns more about the game, its rules, and the opponent. Every time the page is refreshed, thispersondoesnotexist.com shows an image created entirely from scratch by an AI system trained on millions upon millions of portraits of real people. In the GAN approach to machine learning, one "half" of the system generates images, while the other "half" is shown either a generated image or a real photograph and must guess which it is. The more rounds played, the better both the generator and the guesser become.
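For readers who like to see the moving parts, here is a minimal sketch of that generate-and-guess loop in PyTorch. The network sizes, learning rates, and the flattened 64-by-64 images are illustrative assumptions on my part; the real site is based on NVIDIA's much larger StyleGAN family of models.

```python
# A minimal sketch of the generator-vs-discriminator loop described above.
# Sizes and learning rates are illustrative assumptions, not the setup behind
# thispersondoesnotexist.com.
import torch
import torch.nn as nn

latent_dim = 100                      # size of the random "seed" vector
image_dim = 64 * 64 * 3               # a flattened 64x64 RGB image

# The "artist": turns random noise into an image.
generator = nn.Sequential(
    nn.Linear(latent_dim, 1024), nn.ReLU(),
    nn.Linear(1024, image_dim), nn.Tanh(),
)

# The "detective": guesses whether an image is real or generated.
discriminator = nn.Sequential(
    nn.Linear(image_dim, 1024), nn.LeakyReLU(0.2),
    nn.Linear(1024, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def training_step(real_images: torch.Tensor):
    """One round of the chess game: real_images is a batch of flattened photos."""
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1. Train the detective: show it real photos and fresh fakes.
    noise = torch.randn(batch, latent_dim)
    fakes = generator(noise)
    d_loss = loss_fn(discriminator(real_images), real_labels) + \
             loss_fn(discriminator(fakes.detach()), fake_labels)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2. Train the artist: it "wins" when the detective calls its fakes real.
    g_loss = loss_fn(discriminator(fakes), real_labels)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()
```

In practice this step would be run over many thousands of batches of real portraits; the fakes start out as pure noise and only become Jake-like after a great deal of this back-and-forth.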
Thispersondoesnotexist.com displays only the AI-generated images. One may understandably find this unsettling, though it may be hard to explain precisely why. It is hard to imagine that Jake truly never existed, given how real he looks. Try looking deeply into Jake's face. You may be surprised by an unexplainable lifelessness in his eyes. Something is clearly not right with Jake, but it is nearly impossible to say what.
In the last decade, we have seen a steep, exponential climb in technological capability. Artificial intelligence has greatly accelerated our efforts to create new technologies. The RAM in my computer represents an increase of over thirty-three million percent over the computers my father used in high school. In the last thirty years, we have watched computers shrink from large, clunky, bulky beasts to tiny Raspberry Pis and Arduinos that can be as small as a potato chip.
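To make that percentage concrete, here is a rough back-of-the-envelope calculation. The 48 KB and 16 GB figures are stand-ins, a typical late-1970s home computer versus a modern laptop, rather than the exact machines involved, but they show how a number in the tens of millions of percent falls out of ordinary specs.

```python
# Back-of-the-envelope check of the RAM-increase percentage. The 48 KB and
# 16 GB figures are illustrative stand-ins, not the exact machines in question.
old_ram_kb = 48
new_ram_kb = 16 * 1024 * 1024                     # 16 GB expressed in KB
increase_pct = (new_ram_kb - old_ram_kb) / old_ram_kb * 100
print(f"{increase_pct:,.0f}%")                    # ~34,952,433%, well over thirty-three million
```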
Consequently, it is only rational to expect that this rapid development will continue well into the foreseeable future and beyond. Thispersondoesnotexist.com has been around for nearly two years now, which is an eternity for Silicon Valley and the technology sector. We are quickly nearing a GAN that can create fakes detailed and realistic enough to be indistinguishable from genuine photographs, and I would reason that this feat would have been achieved much sooner had there been a stronger push for it.
Naturally, the next step after mastering realistic image generation is realistic video generation. This already exists in some forms, though at present the flaws are far more noticeable in video than in still images. It is far from unreasonable to assume that hyper-realistic videos, whether augmented from genuine footage or generated procedurally from scratch, are very close to our grasp.
In the past decade or two alone, our societies have seen a distinct shift towards skepticism. We have developed the skill of assessing how credible a piece of media or news is. For a long time, people were likely to believe anything they saw in videos or images posted online. Videos and pictures were largely held as fact, as irrefutable proof. Now, we are much more wary, due to technology's growing ability to alter existing media, let alone generate new media altogether.
Now, one must consider the ramifications of such a technology. In the event that we really attain the capability to fake videos, seamlessly swapping faces or generating videos of fictional events altogether, how will this impact our society?
I predict that within the next 10 to 15 years, a large, high-profile court case will emerge in which video evidence is used to incriminate someone, potentially someone notable or prominent, of a ludicrous crime. The case will be disputed in the public arena: some will claim that the video evidence was faked or altered, while others will insist it is legitimate. This case will throw everyone into disarray and set a dangerous precedent for case law.
Suppose the court rules that modern technology has developed so far that the evidence in question cannot be given weight in the ruling, as it could just as well have been forged. What then? What does this mean for criminal justice proceedings? The Internet of Things is becoming increasingly well connected; video cameras dot almost every storefront, ATM, office, and factory, and now even homes and neighborhoods. More video is being taken per second than was collectively taken in certain decades of the twentieth century. Were this future ruling to discredit video evidence as proof, all of these cameras would, in many ways, become irrelevant. What is the point of having security cameras if video footage loses its power to incriminate?
I have spent the past few days in mild but genuine worry about this topic. I really cannot think of any possible solutions. When technology becomes so phenomenally powerful that videos can be forged and edited seamlessly and realistically, it is inevitable that a court case will arise questioning the authenticity of some piece of video or image evidence. Once courts start ruling that the likelihood of faked images and videos is too high, the precedent will exist, and the argument to omit photographic or video evidence from a case will be sound.
I emphasize the word inevitable. I really do not see how technology can continue to progress as it has without this scenario coming about in some shape or form.
If you have any thoughts, solutions, or theories about this hypothetical future in which video and photographic evidence holds no incriminating value, feel free to email me about them. I'd love to hear what other people think.