I stumbled across the #deepnostalgia hashtag on Twitter this morning and was instantly curious, so I went to take a look at what it was all about. Deep Nostalgia is offered by the online genealogy website MyHeritage, and uses Artificial Intelligence licensed from D-ID to create the effect that a still photo is actually moving.
But Deep Nostalgia can take photos from any camera and seemingly bring them to life.
The program has a store of pre-recorded driver videos of facial movements; it takes uploaded photographs and applies the one that works best for the photo in question. MyHeritage’s declared intention is to allow users to upload photos of deceased loved ones and see them in “action”.
There have been debates around the ethics of this kind of work for years now, particularly in the movies, where dead actors have been brought back to the screen and animated via a motion-captured performance. But few people realise just how common this has been in the last twenty years or so. The late Marlon Brando was CGI’d back into his role as Jor-El for Superman Returns in 2006, and more recently Peter Cushing was brought back to life to “reprise” his role as Grand Moff Tarkin in the Star Wars spin-off film Rogue One.
And, of course, there was the famous Galaxy chocolate advert featuring a digital 1950s Audrey Hepburn at her loveliest.
Questions about the use of CGI in this way, and about the integrity of such performances, have been mulled over on social media too. Actors Chris Evans and Elijah Wood spoke out against the resurrection of James Dean in 2019 for the movie Finding Jack, and while I think their objections are valid for the use of CGI to recreate on-screen performances, I don’t think Deep Nostalgia is really affected by most of those issues.
There is some crossover in the deception that Deep Nostalgia commits, though. In applying pre-recorded facial movements to the face in an old photograph it may appear to have brought that person “back to life”, but it is no more that person than the CGI James Dean is the actual 1950s film icon.
Nuances of character, personality and movement are unique to an individual, and all the software is doing is applying someone else’s nuances to a recognisable face. Julie Ann Emery sums up the issues succinctly on Twitter…
Yeah, that's not James Dean.
It's his face on a motion capture performance and an "anonymous" actor providing voice pattern and choices.
I'd like to know how it will be credited.
How the real actors will be paid.
And how little this team understands the acting craft https://t.co/MkIQHrB5Y0
— Julie Ann Emery (@julieannemery) November 6, 2019
For me, the stated purpose of recreating the dead to this extent for nostalgia seems a bit creepy and I can’t see what it adds. It may offer comfort to someone who misses a deceased loved one, but I think people are allowing themselves to be deceived into thinking the moving, smiling, blinking video is the actual person.
I don’t think these issues compare to, say, facial reconstructions of dead people, though. There’s no attempt to create a personality or choices in something like that, and there is historical value in knowing what people looked like many years ago.
I suppose what MyHeritage are doing is a bit of fun. It’s a clever innovation, but in itself it doesn’t really add anything beyond nostalgia and whimsy. That said, here’s Richard III, animated from the facial reconstruction created when his skeleton was found in 2012…
… and here is a picture of my great grandfather, John James Sewell. He was born in 1898, a Victorian, and he served in the trenches in the Great War. He died in 1966 and I never met him, but I will contradict everything I’ve just typed above and admit to feeling a bit of curiosity in seeing a moving image of him – even if he has been dead for fifty-five years.
MyHeritage is an online genealogy website and requires a free sign-up before you can use their Deep Nostalgia service.