A deepfake video created by Dutch police could help to change the often negative perception of the technology.
Deepfakes use generative neural network architectures – such as autoencoders or generative adversarial networks (GANs) – to manipulate or generate visual and audio content.
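For readers unfamiliar with how this works, the sketch below illustrates the classic autoencoder-based face-swap setup: a single shared encoder paired with two person-specific decoders, so that encoding one person's face and decoding it with the other person's decoder transfers identity while keeping pose and expression. This is a minimal illustration in PyTorch; the image size, layer widths, and training loop are assumed for brevity and are not details of the system the Rotterdam police used.

```python
# Minimal sketch of the autoencoder face-swap idea (illustrative only).
# One shared encoder learns a person-agnostic latent code; each decoder
# is trained to reconstruct only one person's faces. Swapping = encode
# person A's face, decode it with person B's decoder.
import torch
import torch.nn as nn

LATENT = 256  # assumed latent dimension

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1),    # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Conv2d(128, 256, 4, stride=2, padding=1), # 16x16 -> 8x8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, LATENT),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(LATENT, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1),     # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 256, 8, 8)
        return self.net(h)

encoder = Encoder()
decoder_a = Decoder()  # trained only on faces of person A
decoder_b = Decoder()  # trained only on faces of person B

# Training (simplified): each decoder reconstructs its own person through
# the *shared* encoder, pushing the latent space to be person-agnostic.
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)

faces_a = torch.rand(8, 3, 64, 64)  # stand-in batches of aligned face crops
faces_b = torch.rand(8, 3, 64, 64)

for _ in range(1):  # one step shown; real training runs many epochs
    opt.zero_grad()
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# The "swap": encode person A's face, decode with person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
print(swapped.shape)  # torch.Size([8, 3, 64, 64])
```

GAN-based approaches differ in the details, but the core idea is the same: a learned mapping that synthesises one person's likeness from another's source footage.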
The technology is already being used for malicious purposes, including generating sexual content of individuals without their consent, committing fraud, and creating deceptive content aimed at changing views and influencing democratic processes.
However, authorities in Rotterdam have shown that the technology can also be put to good use.
Dutch police have created a deepfake video of 13-year-old Sedar Soares – a young footballer who was shot dead in 2003 while throwing snowballs with his friends in the car park of a Rotterdam metro station – in an appeal for information to finally solve his murder.
The video depicts Soares picking up a football in front of the camera and walking through a guard of honour of his relatives, friends, and former teachers on the pitch.
“Somebody must know who murdered my darling brother. That’s why he has been brought back to life for this film,” says a voice in the video, before Soares drops his ball.
“Do you know more? Then speak,” his relatives and friends say, before his image disappears from the field. The video then gives the police contact details.
It’s hoped the stirring video, and the reminder of what Soares looked like at the time, will help to jog memories and lead to the case finally being solved.
Daan Annegarn, a detective with the National Investigation Communications Team, said:
“We know better and better how cold cases can be solved. Research shows that reaching witnesses and the perpetrator on an emotional level, with a personal appeal to share information, works. What better way to do that than to let Sedar and his family do the talking?
We had to cross a threshold. It is no small thing to ask relatives: ‘Can I bring your loved one back to life in a deepfake video?’ We are convinced it will help the investigation, but we have not done it before.
The family has to fully support it.”
So far, it seems to have had an impact. The police say they have already received dozens of tips, though they still need to assess whether the leads are credible. In the meantime, anyone who may have information is encouraged to come forward.
“The deployment of a deepfake is not just a shot in the dark. We are convinced that it can touch hearts in the criminal environment, and that witnesses and perhaps the perpetrator will come forward,” Annegarn concludes.
(Source: AI News)