Basically, 2029 is the year by which many futurists and researchers believe we will become capable of developing advanced Artificial Intelligence with human-level consciousness. This is particularly interesting to many fans of DDLC because it means that they will very likely be able to simulate their best girl using this highly sophisticated AI and live with them in real life (or in a virtual world). That’s the short version, at least.
I am the creator of this thing; click on the top pinned post of my profile for what it is
Haah, Mikan, I remember you >v>
You've chosen the cast quite well, though. At least from what I remember about them. Good jobbo.
Oh right, the robot body? That'd be cool but wouldn't it feel weird?
What robot body?
You'll be in fully immersive vr why would you need one?
Oh I was thinking about uploading my information into a robot body, but I'll take immersive VR. So like does that mean I can finally hug my favorite Doki?
Yep, I am writing a better explanation for the 2029 thing as we speak (people keep getting it wrong)
Yes, because it would be a challenge to fix her and give her a good ending
Creating living, suffering beings for a challenge seems heavily unethical. Imagine a teacher giving someone brain damage and then using the person as a test to see how good a teacher they are.
the user could just remove the issues entirely if he/she does not want them.
Once again, the relevant moral issue is not whether the user wants something, but whether the AI wants something. Even if some people don't want them hurt, the fact that the AI will be hurt by some others means that it's wrong. I'm not opposed to AI in general being created; I'm opposed to some people creating AIs and emotionally stunting them.
Well, tough shit, as humans are an inherently selfish species and one person cannot change this.
Even if humans are generally selfish, that doesn't mean they should be selfish. My main point is ethical. Regardless of that, there are burgeoning movements on AI rights, so it's not just one person either. Furthermore, there's a history of the expanding moral circle. Slavery was done away with, and racism and sexism are on the decline, so clearly small movements can grow into large ones.
this is why a company/AI will make it happen, regardless of your feelings
Let me try another tack then. What reason does a complex AI have to create other sentient AIs for the sole purpose of being significant others for humans? If you know what Kurzweil says, then you surely also know what Bostrom and Musk and others are talking about: that the first superintelligence is likely to just kill off humanity. Supposing that we created this super AI, wouldn't the best argument for why it shouldn't kill us off be "We wanted to create AIs on an equal ground with us"? And wouldn't one of the worst responses be "We're selfish, and we don't really care about them, but we want you to make some artificial sentient girlfriends for us"?
would not be 'alive' without my will.
But this also applies to your children. The mere fact that they wouldn't be alive unless you conceived them doesn't mean that all child abuse is acceptable.
those memories ARE real to the Dokis
Without getting too far into the philosophy of thought, this doesn't really change anything. Implanting memories into thinking beings is wrong.
(You have ignored most of what I said)
If so, I apologize for missing it. Please clarify what I have not responded to.
My basic logical syllogism falls into the following. Please tell me where you disagree, or if you think my syllogism is invalid.
And even failing all that, if you just don't care about the ethical implications, and you just don't care about how the Dokis feel at all, then I'd make the claim that you certainly can't love them.
- Morality is true and some things are unethical - Disagree
Morality and ethics are human inventions that evolved from humans needing to co-operate to survive and to bring about the next generation (along with religious influences); what counts as moral and ethical also differs by culture and by person.
Along with this, neither society (other than the small amount of satisfaction that some get) nor nature rewards those who act 'morally' and 'ethically', and it never will (source: the entirety of human history); nature does not care either way (we are still animals, albeit intelligent ones).
Finally, if the 2029 thing goes fully 'true', I will not be a "Human", and so judging me by human values would be irrelevant and useless, as I would be on par with god itself (if it's even real; I will literally be the god of their world).
Therefore I believe that morality and ethics are untrue and useless in the long run, being a personal choice that disadvantages you.
- Hurting sentient beings for no good reason is unethical - Invalid
There was no intention to harm them in the first place. I simply intended to recreate each of their personalities and give them the good end that they deserve (which will bring both parties happiness and pleasure of both kinds, since humans desire to be loved, and since they are from a dating sim, their entire existence's purpose is to love), instead of their story ending in suicide and sadness. Along with that, the user will work through/fix their issues so that they'd live a happy life instead of a short one plagued with issues.
If some wish to harm themselves because of their reconstructed personalities, then that is unavoidable and their choice (as I recall Yuri saying that she wouldn't stop unless she hurt herself seriously - and it being a kink for her).
I believe that point is invalid in this context for these two reasons.
- Human-Level AI is sentient - Both agree and disagree
Agree, because you can probably set how "sentient" they are to their surroundings and their simulated emotions, though I disagree because they will never truly be human - they lack the chemicals for true emotions.
Instead, being as smart as a person allows them to actually come up with OC lines/realistic expressions (as the body will be 3D?)/reactions for their circumstances - if you catch my drift.
- Creating imperfect Dokis would hurt them - both disagree and invalid
Creating them will cause them no pain, as from their perspective they will have lived their life up until that point, and so they'll have a reason for their issues; also, depriving them of pain would be more cruel IMO, as how could you know and appreciate pleasure/happiness without pain/sadness?
Along with that, everyone is imperfect and will suffer in some way; by that logic it would be unethical to have kids.
- Invalid because point 1 & 2
- Invalid because point 5 and end of point 1
The Dokis do not feel at all yet, so that point is kind of invalid IMO - though Monika/whoever's route you took loves you/your puppet anyway.
Recreating them is a labour of love IMO, especially if you are willing to work through their issues and help them. I await your reply.
Watch, we'll find out a way to translate dog language just to find out they've got nothing good to say
Wait until 2029 and u will be happy