Prosecutors told to verify ‘deepfake’ evidence that could let criminals off hook

Criminals could soon use deepfake technology to disguise themselves in CCTV footage, experts have warned.

It comes after prosecutors were told to ensure criminals aren't let off the hook with digitally tampered photographic evidence.

Law associate Julija Kalpokiene told the Daily Star Online: "That could well be, especially when all the surveillance systems are inter-connected.

"A cyber-criminal may be able to tweak the systems so the surveillance would not show who is the real criminal."

The intellectual property expert warned we are "several years" from the technology becoming available to the general public.

She said that it could spark a cat-and-mouse game with police struggling to keep up with cyber criminals using AI systems to learn how to get around security protections.


She spoke after Director of Public Prosecutions Max Hill QC this week warned "technological advances are accelerating in almost every area".

He said: "The criminal justice system, including the CPS, may soon have to routinely be able to distinguish between a real voice recording or video and a 'deep fake'."

Ms Kalpokiene, of Jurisconsultus law firm in Kaunas, Lithuania, said: "Deep fake technology – it's Artificial Intelligence, it's machine-learning and it can learn from detectives' software and improve its algorithm so it's a constant race between the detection and the technology that tries to hide the fakes.


"At the moment it's still in its infancy, but with technological advances we are talking about several years' time rather than decades until deep fake technology might be available to everyone on the street.

"It's important to educate society at large, also CPS staff and judges, to raise awareness about how these new technologies can be used.

"We are speaking about people's freedom but also in civil cases I see that this might be an issue in future where evidence might not be trusted because of deep fakes.


"In divorce cases, say, for example, one wants custody of the children and to tarnish the partner.

"With every technology there are people that use it for good purposes and there are people who use it for their own good."

President of the Lithuanian Young Bar Association Dr Edvinas Meskys, 35, warned deepfakes raise the risk of nuclear war and fuel conspiracy theories.


He said: "Another thing would be if it's the president of the US and the video says, 'let's attack North Korea'.

"This would be the worst thing and it might raise a counter-reaction and it would be too late.

"The Prime Minister could be in deepfakes saying something – if people would see this as real it might impact their behaviour.

"We tried to analyse what it would be if someone said later this was not true – for most people they say, 'maybe it was true, maybe it was rumours'."

He added: "The topic is very relevant in the age of technology, as the line between real and fake is getting thinner.

"This means that as the technologies for creating deep faked videos become more accessible, it is just a matter of time before it becomes hard to prove the source.

"This calls for immediate reaction and a need for anti-deep fake solutions in order to ensure the validity of the source, as such deep fakes could raise threats in democratic processes and be used for illegal activity."

Prosecutors are already dealing with deepfake cases in revenge porn where a person's face is superimposed on a pornographic photo or video.

Last month Facebook said it had shut down a new attempt by Russia's Internet Research Agency to meddle in US and UK politics via AI-generated photos.
