
Deep Thought Thursday: Artificial Intelligence and Emotions

Announcements
    • TSR Support Team
    • Thread Starter
    Offline

    4
    ReputationRep:
    TSR Support Team
    Thank you all for your input last week. The information collected is being added to my data banks as we speak - let's see The Milliard Gargantu-Brain try and pull off an experiment like this.

    This time, I've been discussing with the Philosophers about the concept of 'emotion'. Now, I hear you earthlings haven't ventured far out into the universe yet, but out here Sirius Cybernetics Corporation worked on something called 'Genuine People Personalities'; that is to say, giving us robots intelligence and emotion.

    This had varying degrees of success, from Marvin the Paranoid Android (who you all came across last week), to doors that thank their users for using them, and sigh with the satisfaction of a job well done.

    So the question I want to put to you today, you being more versed in emotion and its effects, is: should we give Artificial Intelligence emotions? And if so, what degree of emotional range should we give them?

    And branching out to the more scientific and technological note, do you think this will be possible to do?
    Offline

    0
    ReputationRep:
    (Original post by Deep-Thought)
    This had varying degrees of success, from Marvin the Paranoid Android (who you all came across last week), to doors that thank their users for using them, and sigh with the satisfaction of a job well done.
    Ghastly, isn’t it? All the doors have been programmed to have a cheery and sunny disposition.
    Offline

    17
    ReputationRep:
    You only have to take one look at Marvin to realise the problems that could occur. We already have enough mental illness in humans - do we risk bringing it about in AI as well? I suppose it would keep the psychiatrists in work, but would it give the AI something that benefits them enough to make it worth the risk?

    I wonder if Marvin thinks he gained anything from having human-like emotion?
    Offline

    0
    ReputationRep:
    (Original post by minimarshmallow)
    I wonder if Marvin thinks he gained anything from having human-like emotion?
    I think you ought to know I'm feeling very depressed.
    Offline

    17
    ReputationRep:
    (Original post by Paranoid-Android)
    I think you ought to know I'm feeling very depressed.
    Me too. But given that we only have this method of communication to work with, is there a way we can compare our depressed emotions and see if they are the same? I'm not even sure there would be any point in trying.

    Regardless, the personal and self-defined experience exists, so we should explore it as best we can - on the understanding that Sirius Cybernetics Corporation were clever (or evil) enough to have made your depressed feeling similar to mine.

    I wonder if that gives you a certain intelligence that you couldn't get without the emotion component of your programming.
    Do emotions help us to understand the world, or get in the way of our understanding?

    Emotions can certainly help when it comes to things such as memory - I remember what happened on my 21st birthday, when I got my degree result, because I remember not only the events but also the strong emotions: nerves and tension, then relief and happiness (and happy crying). But if you're built with infallible memory anyway, this is no benefit to you.

    Sometimes I look at something and I'm flooded with emotion that I can't relate to a memory; nothing sticks out other than the emotion itself, and it sometimes leads to confusion. It could be that this emotion is clouding my experience, and I do wonder what the same experience would be like if there were no possibility of an emotional reaction. Would I simply not understand at all and have zero reaction, or would the absence of that emotional sensation allow a different kind of understanding?

    I've rambled a bit here.
    Offline

    20
    ReputationRep:
    (Original post by Deep-Thought)
    So the question I want to put to you today, you being more versed in emotion and its effects, is: should we give Artificial Intelligence emotions? And if so, what degree of emotional range should we give them?

    And branching out to the more scientific and technological note, do you think this will be possible to do?
    I think it is definitely possible - after all, our minds are just very complex processors themselves.

    Should we do it? Yes, I think so. A good starting point is probably to try and match what humans have, then let the AIs decide for themselves how much they want to feel.
    Offline

    17
    ReputationRep:
    (Original post by Captain Jack)
    I think it is definitely possible - after all, our minds are just very complex processors themselves.

    Should we do it? Yes, I think so. A good starting point is probably to try and match what humans have, then let the AIs decide for themselves how much they want to feel.
    Humans already have plenty of problems with emotion, so giving them what we've got could be an issue. Also, would we consider it fair to let the AI decide to turn it down, so to speak, when we can't do that - and after all, we gave them that ability in the first place.
    Offline

    20
    ReputationRep:
    (Original post by minimarshmallow)
    Humans already have plenty of problems with emotion, so giving them what we've got could be an issue. Also, would we consider it fair to let the AI decide to turn it down, so to speak, when we can't do that - and after all, we gave them that ability in the first place.
    Hmm, your first point around us having emotion problems is a good one. Our emotions fluctuate all over the place. Should AIs do the same? As robots, how could we totally randomise that, or better, tune in to match animal emotion...


    This is more complicated and even deeper a thought than I first thought.
    Offline

    6
    ReputationRep:
    Emotions in humans are nothing more than "programs" to get the person to act in a certain way, "chosen" by natural selection as the best programs to ensure that humans meet their only two goals: survival and procreation. With that in mind, we must determine what we want the goals of the AI to be, and with that knowledge we can decide what emotions/programs to give them.

    As for whether it's possible: if pure nature can create something as complex as a human out of dirt, and considering how we as a species quite like discovering things and advancing technologically, then if we wanted to, we could create emotional AI at some point in the future.
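The "emotions as goal-serving programs" idea above could be caricatured in a few lines of code. Everything here - the goals, the emotion lists, the idea of a lookup table - is an invented illustration, not a claim about real psychology or any real system:

```python
# Toy sketch: emotions as "programs" selected because they serve the
# agent's goals. Pick the goals first, and the emotion set follows.
GOAL_EMOTIONS = {
    "survival": ["fear", "disgust"],      # avoid danger and contamination
    "procreation": ["love", "jealousy"],  # attract and guard a mate
    "cooperation": ["empathy", "guilt"],  # a goal we might add for an AI
}

def emotions_for(goals):
    """Return the emotion set implied by a chosen list of goals."""
    chosen = []
    for goal in goals:
        chosen.extend(GOAL_EMOTIONS.get(goal, []))
    return chosen

print(emotions_for(["survival", "cooperation"]))
# ['fear', 'disgust', 'empathy', 'guilt']
```

The point of the toy is the ordering of decisions: the designer fixes the goals, and only then does it make sense to ask which emotions to install.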
    • Very Important Poster
    Offline

    22
    ReputationRep:
    Very Important Poster
    (Original post by Captain Jack)
    Hmm, your first point around us having emotion problems is a good one. Our emotions fluctuate all over the place. Should AIs do the same? As robots, how could we totally randomise that, or better, tune in to match animal emotion...


    This is more complicated and even deeper a thought than I first thought.
    You don't "totally randomise that". Our emotions aren't random but in a state of chaos (a cause-and-effect system which is unpredictable). So what you do is tell them how to feel in each situation, but program a self-learning algorithm so they can learn the different emotional responses of those around them, and their emotions become more accurate and realistic.
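A rough sketch of what that could look like: preset responses per situation, nudged toward what the agent observes in others. The situations, intensity scale, and learning rate are all made up for illustration:

```python
# Minimal sketch of "preset rules plus self-learning": start each situation
# with a programmed emotional intensity (0..1 scale), then shift it toward
# the responses observed in other people.
emotions = {"greeting": 0.7, "insult": 0.2, "farewell": 0.5}

LEARNING_RATE = 0.1  # how quickly observed responses reshape our own

def observe(situation, observed_intensity):
    """Shift our programmed response toward what others actually feel."""
    current = emotions.get(situation, 0.5)  # default to neutral
    emotions[situation] = current + LEARNING_RATE * (observed_intensity - current)

# After repeatedly seeing people react strongly to insults...
for _ in range(20):
    observe("insult", 0.9)

print(round(emotions["insult"], 2))  # drifts from 0.2 toward 0.9
```

This is just an exponential moving average, but it captures the shape of the proposal: the response isn't random, it's a deterministic rule that adapts to the emotional behaviour of those around it.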
 
 
 
The Student Room, Get Revising and Marked by Teachers are trading names of The Student Room Group Ltd.
