Monday, January 09, 2023

AI After Death: interactions with AI representations of the deceased

 I want to pass on to MindBlog readers the following excellent notes that Terry Allard made to guide a discussion at the Nov. 29, 2022 session of the Chaos & Complex Systems Discussion group at the Univ. of Wisconsin. 

Chaos & Complex Systems Discussion

AI After Death: interactions with AI representations of the deceased
November 29, 2022
Source material: “AI’s New Frontier,” by Caren Chesler, Washington Post, Nov. 12, 2022
https://www.washingtonpost.com/health/2022/11/12/artificial-intelligence-grief/ See also https://www.media.mit.edu/projects/augmented-eternity/overview/

AI companies have begun mining digital content and real-world interviews to create AI representations of people with whom their survivors can interact.

The digital representations are created from social media posts, email, electronic surveillance, voice recordings, and sometimes actual interviews with the individuals themselves before they pass away.

The interaction can take place directly with visual, audio, or text avatars.

  • The documentary “Meeting You” featured a digital re-creation of a recently deceased child, whom the mother could see and interact with through a virtual reality headset.

  • Augmented Eternity (MIT Media Lab): This project uses a distributed machine intelligence network to enable its users to control their growing digital footprint, turn it into their digital representation, and share it as a part of a social network.

    Our digital identity has become so rich and intrinsic that without it, it may feel like a part of us is missing. The number of sensors we carry daily and the digital footprints we leave behind have given us enough granular patterns and data clusters that we can now use them for prediction and reasoning on behalf of an individual. We believe that by enabling our digital identity to perpetuate, we can significantly contribute to global expertise and enable a new form of an intergenerational collective intelligence.

    https://www.media.mit.edu/projects/augmented-eternity/overview/

  • Amazon unveiled a new feature it’s developing for Alexa, in which the virtual assistant can read aloud stories in a deceased loved one’s voice.

  • Several entrepreneurs in the AI sphere, including James Vlahos of HereAfter AI and Eugenia Kuyda, who co-founded the AI start-ups Luka and Replika, have turned their efforts toward virtual representations of people, using data from their digital footprints to craft an avatar or chatbot that can interact with family members after the person has passed away.

    HereAfter’s app guides users through an interview process while they are still alive, prompting them to recount stories and memories that are then recorded. After the person dies, family members can ask questions, and the app responds in the deceased’s voice, drawing on the accumulated interview material, almost as if it were engaging in a conversation (a rough sketch of this kind of retrieval appears below).
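
To make the mechanism concrete, here is a minimal, hypothetical sketch of the kind of retrieval such a memorial chatbot could use: answers recorded during a lifetime interview are stored, and a survivor’s question is later matched to the most similar recorded prompt. The names used here (MemorialChatbot, record_memory, answer) are illustrative assumptions, not HereAfter AI’s actual software, and a real system would return audio in the person’s own voice rather than text.

```python
# Hypothetical sketch: store answers recorded during a lifetime interview,
# then match a survivor's question to the most similar recorded prompt
# and return that recorded answer. Uses only the Python standard library.
from difflib import SequenceMatcher


class MemorialChatbot:
    def __init__(self):
        # Maps interview prompts to recorded answers (plain text here; a real
        # system would store audio clips in the person's own voice).
        self.memories = {}

    def record_memory(self, prompt, answer):
        """Called while the person is alive, during the interview process."""
        self.memories[prompt.lower()] = answer

    def answer(self, question):
        """Called by survivors: return the answer whose recorded prompt
        is most similar to the question asked."""
        if not self.memories:
            return "No memories have been recorded."
        best_prompt = max(
            self.memories,
            key=lambda p: SequenceMatcher(None, p, question.lower()).ratio(),
        )
        return self.memories[best_prompt]


bot = MemorialChatbot()
bot.record_memory("Tell me about your wedding day",
                  "We were married in June, in my parents' garden.")
bot.record_memory("What was your first job",
                  "I delivered newspapers on a blue bicycle.")
print(bot.answer("What do you remember about your wedding?"))
```

The point of the sketch is only that an avatar of this kind replays and recombines what was recorded; it does not generate new memories, which bears directly on several of the discussion questions below.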

Some Questions for Discussion:

  1. How does posthumous interaction benefit the survivors? Are there risks? Could it lead to someone wanting to remain in this virtual world of their loved one?

  2. Could posthumous digital avatars have a therapeutic benefit for the grieving?

  3. Can digital avatars replace human interaction writ large?

  4. Can digital avatars learn and evolve on their own?

  5. Are digital avatars alive or could they be? How do we define sentience?

  6. Will “deep fakes” compromise trust in online person-to-person interactions?

  7. Can people download their identities into digital form and transcend (cheat) death?

 
