Interesting. I guess it's only ever been a question of time; they can already reconstruct actors and singers pretty accurately.
I too am on the fence. The risk is it would become an addiction, like you all have mentioned. But then again, it wouldn’t BE that person, just a reflection of sorts, so I wonder how much you would notice that. Look at AI-generated images. They are amazing, but they lack something. I’m not sure something computerised can ever be conscious the way a living creature can. But I guess it depends on what consciousness is and we don’t really know that.
For me personally, I already worry about strange things, like how can I be sure it’s them if we meet again and not just a figment of my imagination. I don’t think I’d feel comfortable with AI, but I would probably still give it a go.
Just struck me. Maybe it wouldn’t be addictive because you’d always feel something was missing and so only be frustrated?
@Ulma I don’t think it’s there yet - AI machine learning is always going to be about the amount and quality of the information fed into it. The more info, and the more accurate it is, the better the AI can learn to emulate that person. It will need a lot of info to get it almost perfect, to the point where you can believe you’re actually talking to your partner. It’s going to take more than a few text messages and video clips.
For people in our position, we are going to have to recreate a lot of that.
Digital avatars and voice cloning, as a somewhat physical representation, aren’t there yet for you to be able to fully fool yourself into believing it’s the person - but it’s getting closer and closer.
I do think by the end of this year, or early next year, it will have reached that point
I’ll also probably end up trying it when it reaches the point where I think it could possibly emulate my partner with a high level of accuracy
Whether it’s a good idea or a Pandora’s box, I guess time will tell
Hi, reading about this, it looks good on paper. Of course we all would want to see and talk to our partners, but for me it wouldn’t be enough. I want to be able to hold my husband, to kiss him, to be intimate with him, to just feel his presence next to me. But if it helps someone living/existing in this grief, then yes, it could be helpful. I think it would totally mess my brain up. But that’s just me. x
@MemoriesOfUs my point about consciousness wasn’t relating to the AI being actually conscious, but to how we would go about recreating the simulacrum of consciousness in an avatar representing a deceased person. I don’t really think that would be possible, given that there would be very limited resources to rely on. I would struggle to confidently describe the philosophical viewpoints of my wife - I struggle to articulate my own, tbh; our identity is a very complex, malleable beast. Another way of thinking about this, for me, is: would I be able to recreate an accurate representation of my own self? I doubt I would, and that’s with having access to all the primary resources that I need.
And as for AI gaining consciousness, I think that probably will happen, especially with the advent of quantum computing. It may just be a machine learning program based on mathematics, but in the end we’re just a bag of chemicals and it’s happened for us. But even given that I still don’t think it would be possible to recreate someone’s personality digitally, especially if the person is deceased. As you say there would be a definite bias in anything that was used as ‘fill in’ so I feel the results would be very different from the person we knew.
Yes, I’m on the fence too; it’s an interesting, mind-boggling concept.
I miss my husband so much, the physical side and the talking side. I often think maybe if I could just have conversations with him it would give me so much comfort - it would almost be like a long-distance relationship. Or would that just mess with my already messed-up head? But my hubby was so unique and unconventional, I don’t think this AI thing would be able to replicate/work out what he would say.
Very thought provoking x
@Walan it’s not consciousness as such - it’s whether, if you had a conversation with the AI, it would be accurate enough, or capture their personality to the extent that you could believe you’re talking to your deceased partner
The more info fed into the AI, the closer it becomes to emulating that person.
It never will be conscious, it’s just an illusion
If an illusion is convincing enough, it can become reality
For me, it would be a terrible idea; I wouldn’t want my memories of him replaced by an avatar. And I wonder what John would think; I know he’d hate it. I think - for me - it would be very selfish, trying to cling onto him when he has gone, not accepting the dignity and completeness of his death.
@MemoriesOfUs I understand that it’s not the creation of consciousness, as I’ve outlined previously that’s never been my point. Instead, as you say, it’s the creation of a simulacrum of consciousness that would be required for this to be effective. My argument is that I don’t think this would be achievable as we have no consensus on how consciousness works and therefore don’t have an accurate working model to build from.
If we have no accurate working model then the amount of data we feed in becomes irrelevant. This is particularly the case when the end goal is to create the illusion of a specific personality that can respond in real time convincingly enough to feel like the actual person is present.
@Walan the realistic interactions already exist - the world of AI companionship saw global investment of USD 50 billion in 2023
Specific traits and characteristics are chosen, and AI companions or partners are generated. These are learning models, so they will learn according to the interactions
If it didn’t simulate interactions and real time responses realistically enough, there wouldn’t be that level of investment
You don’t have to look too deeply to see the number of these available
Digital resurrection is also not a new concept, albeit a niche market. Joshua Barbeau used Project December in 2020 to create a digital version of his deceased girlfriend, built on GPT-3 and her text messages, which he used to gain some sense of closure; the interactions were considered highly successful
GPT-4 and 4.5 are vastly superior language models to GPT-3, and GPT-5, which has a proposed release early next year, will be beyond comparison
It will never be a perfect recreation, but with enough accurate learning material fed into the AI, it would be more than enough to create the illusion
While these are capable of emulating a human companion or consciousness, they are not emulating a specific personality. My argument really relates to recreating a specific personality that can be emulated in real time, which is a different order of magnitude. Creating a computer simulation that can be interacted with and convince people that it is human has been achievable for years, but creating a simulation of a specific personality is something I feel is beyond our reach at the moment. Yes, the technology exists, but the understanding of consciousness doesn’t.
I feel that what would result from much of this is an ersatz copy of the personality we are attempting to recreate; it may approach it, but it will always be slightly different.
In the case of Joshua Barbeau, he admits himself that intellectually he knew it wasn’t his late girlfriend, but emotionally he chose to believe in it - as you say, for the purposes of using it as a tool to overcome his grief. Project December only really covered her written output, in text message format, when recreating her personality. A fully interactive visual avatar is something way beyond texting. Written styles are something that can be analysed and reproduced, but I don’t think that an increase in the power of AI will mean a subsequent ability to reproduce personalities that would effectively fool someone with intimate knowledge of that person for very long. Perhaps an increase in the power of AI will lead to discoveries that solve the problem of consensus around human consciousness, but if not, I can’t see how things progress without solving this problem first. We’re more than the sum of our experiences and interactions, and at present we don’t understand why.
As you say, “it will never be a perfect recreation”, and that, for me, is where it falls down: it will always be an illusion we have to choose to believe in, and so part of us will always know that it isn’t real.