The likeness of Sir Jim Mann, a University of Otago professor of human nutrition and medicine and a Dunedin Hospital endocrinology consultant, has been used in online advertisements encouraging people with type 2 diabetes to stop taking their metformin medication and instead use alternative natural products.
Sir Jim said he was alarmed by the deep-fake video because it used footage of him giving a lecture at the Hutton Theatre earlier this year, and it was made to look and sound like a 1News television clip.
People could easily be fooled by the clip, and urging them to stop taking important medication was extremely dangerous, he said.
"You’re not going to drop down dead if you stop taking metformin, but certainly in the long term, it can be detrimental to health — unquestionably.
"Metformin is a very old, very safe, very effective drug still, and those so-called alternatives [in the advertisement] are absolute garbage.
"The other thing is, there was this general message that came through saying there’s all these nasty medicines in general and we should be seeking alternatives that are healthier and natural.
"That’s a very scary message. If there were such things, that would be fantastic, but I know of no alternative medicines that are of any use whatsoever for treating diabetes."
Sir Jim said the thing that frightened him most was how easily many highly intelligent and sensible people had been taken in by the scam.
"Somebody who has known me for 40 years ... wrote to me and congratulated me on this fantastic new advance.
"Somebody once said that if a claim sounds too good to be true, it probably isn’t true — I think that still stands."
Sir Jim said he was "disturbed" at the way artificial intelligence had been able to make his lips look like he was saying things he did not say at the lecture.
"Of course, one hears of it happening to other people, but not until it happened to me did I realise the extent to which the technology could distort the truth."
University of Otago School of Computing head Associate Prof Grant Dick said it was very easy to create deep-fake videos.
"There’s off-the-shelf software that can create moderately convincing deep fakes with almost no effort."
He said a creator could feed a voice sample of the person into software that would impersonate their voice reading a script, while other software moved the person’s lips to match it.
"It is a cause for concern for its use in society, both in this example and for example in politics as well.
"Imagine using it on the prime minister.
"If it gets to a susceptible audience who are willing to believe this kind of stuff, it can potentially be quite damaging," Prof Dick said.
"It creates significant trust issues."
He said the software did have a "valid use" in movie production.
The movie Rogue One: A Star Wars Story used similar technology to recreate Peter Cushing’s head on a different actor.
"Cushing died years ago, but his character was pivotal to the storyline, so they used a different actor and replaced their head with a recreation of Cushing’s head."
He said there were telltale signs of deep-fake videos.
"It’s a bit like Photoshopped images. In the early days, people were easily fooled by them, but over time, we have become more wary of them and we know what to look for.
"Deep fakes have signatures around them that make it possible to see they are AI generated — like the person in the video doesn’t blink at regular intervals and little misalignments in the corners of the mouth."
Prof Dick said people were now working on technology that would be able to automatically detect deep-fake videos, but part of the problem with videos online was that they were low fidelity, which could mask those telltale signs.
"You don’t need to produce something that is studio filmed for it to be somewhat convincing."
Because the technology was becoming more easily accessible, he expected more instances of deep-fake videos to appear online in the future.
Sir Jim encouraged people to source reliable information from Diabetes New Zealand or recognised health professionals.