A friend once told me about a man in Seoul who lost his daughter at age 7.
Years later, he put on a VR headset and there she was again. Laughing. Running toward him. Calling him “dad.”
For him, it was healing. For many who watched, it felt unsettling.
And that’s the paradox of AI resurrection.
We now have the ability to create digital avatars of people who’ve passed away.
Not just static photos, but interactive versions that text, speak, even offer advice.
On the one hand, it’s comforting. Imagine hearing your grandmother’s voice again or asking your late mentor for guidance.
On the other hand, it’s haunting. Who decides what this digital version says? What if it feels “almost right” but not quite?
Here’s the thing. The technology is moving faster than our ability to answer the ethical questions it raises.
Is this remembrance, or is it exploitation?
Is it keeping memories alive, or is it keeping grief from healing?
As with many AI debates, there isn’t a simple answer. But ignoring the question isn’t an option either.
What do you think? If you had the chance, would you want a digital avatar of someone you lost?


