4chan
/tg/ - Traditional Games


File: img000017.png (1014 KB, 1951x1400)
If sentience, sapience, and self-awareness make us human compared to other animals or robotic drones, would it mean that if an artificial being had those capabilities, it or they would be called 'human', at least in a psychological and ethical sense? And following up on that line of thought, will robots/AI be able to feel love, /tg/? Will they be capable of understanding it? If so, would it be the same love that humans feel? What would be the difference?
>>
>>53587392
>would it or they be called 'human', at least in a psychological and ethical sense?

No, they'd still be called robots, but now they would be "people" or "persons".
Human is just the common name for the animal we are.
>>
>>53587392
I need to finish that manga.
>>
>>53587392
I don't think a robot could truly feel emotions like a human. Sure, it can be programmed to think it has emotions, but those wouldn't be real ones, just numbers. Changing the integer in its #var_like wouldn't come even close to approximating the depth of genuine human emotions.
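For what it's worth, the strawman this post describes can be written down in a few lines. This is a deliberately naive toy sketch (every name here is invented for illustration, not any real affective-computing system):

```python
# Toy sketch of the "emotion is just an integer" view this post describes.
# Every name here is invented for illustration.

class Robot:
    def __init__(self):
        self.happiness = 0  # the whole "emotional state" is one number

    def receive_compliment(self):
        # "changing the integer" -- the entirety of the objection in one line
        self.happiness += 1

    def mood(self):
        return "happy" if self.happiness > 0 else "neutral"

bot = Robot()
bot.receive_compliment()
print(bot.mood())  # happy
```

Whether stacking enough of these counters ever adds up to a felt emotion is exactly what the rest of the thread argues about.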
>>
>>53587392
No, it would be considered an intelligent sentient being. If you program a construct to behave like a human, assuming you can even replicate that with machine code, then yes to your 2nd question. Any other kindergarten questions?
>>
File: 1306681625902.gif (679 KB, 150x150)
>>53587392
I don't think we could ever create an artificial being capable of feeling love the way we do, unless it was patterned off of an actual human upload or circuit analysis of a human brain etc.

The precise effects that emotions have on us are complex and subtle, and furthermore have no real reason to naturally arise in an artificial intelligence. I would expect that an AI could become SUBSTANTIALLY smarter than a human and still struggle to comprehensively (rather than anecdotally, i.e. "people who are angry talk louder and have a higher % chance to initiate physical violence; people can be made angry or afraid by threatening the integrity of their blood relations") understand the effect that emotions have on us, let alone actually be able to feel them themselves.

On the flipside, I would expect that a simulated human would still be able to feel most emotions. There would probably be some subtleties of the process lost due to poor fluid diffusion modelling etc., but by and large I think you could get a reasonable approximation of the effect that neurotransmitters have on specific nodes of the neuronal network; or at least, you would need to get reasonably close in order to simulate a human brain well enough for it to function.
>>
>>53587456
>Human is just the common name for the animal we are.
Dammit anon I did not ask for this

>>53587552
I recall an argument from a thread long ago that "even living beings are just machines, just constructed with meat, bone, and a nervous system, which is not that different from a machine working with metal, engines and circuits" (or something along those lines), and they argued that this would not be a problem because of something about there being "only a difference in processing", or something like that. Hold on, let me get back to this after I find that thread again in the archives.

>>53587591
You see, what I was also considering is whether, when they are made, people would accept them as such. There would be different types of people: some who would wish for them to be free like other people, and some who would expect them to just work on whatever they were made for, whatever their intended role was. Wait, I got sidetracked. What I meant was: would people accept it naturally? Would there be trouble regarding its status as a being, due to its never-before-seen characteristics? The general social feel, the media, the government, and stuff like that.
>>
>>53587706
Nope, only natural beings can have genuine emotions. Machines can have an artificial replication of those, but it would work like a Bioware game companion meter. A pile of code wouldn't really understand what love means; it would just be cold, meaningless numbers to it.
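The Bioware-style companion meter this post compares machine "love" to is easy to caricature in code: a single clamped scalar mapped to named tiers. A hypothetical sketch (tier names and thresholds are made up, not taken from any actual game):

```python
# Caricature of a companion "approval meter": one clamped number
# mapped to named tiers. Thresholds and labels are invented.

APPROVAL_TIERS = [(-100, "hostile"), (0, "neutral"), (50, "friendly"), (80, "romance")]

class Companion:
    def __init__(self):
        self.approval = 0

    def react(self, choice_value):
        # dialogue choices just nudge the scalar up or down
        self.approval = max(-100, min(100, self.approval + choice_value))

    def disposition(self):
        # highest tier whose threshold the current approval clears
        label = APPROVAL_TIERS[0][1]
        for threshold, name in APPROVAL_TIERS:
            if self.approval >= threshold:
                label = name
        return label

c = Companion()
c.react(60)
c.react(25)
print(c.disposition())  # romance
```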
>>
>>53587552
I always find it interesting that this is a fairly common sentiment, because people genuinely value their emotions as the unique aspect of their existence, rather than noticing experience itself.

It's not emotion that is unique to people. Emotion is in huge part chemical and physical, and thus replicable, so it's not much different from many values being altered and passed between the nodes of an artificial network that responds in a predictable manner at each stop, with predictable motion on the way. It is complicated, but not impossible to recreate. It is conscious awareness of phenomenal experience which is so impossible to recreate, because we don't understand its mechanisms: we can't test for it, but we experience it, and we presume it in other people around us as one of the most primitive forms of empathy, and we can tell that it seems to stop or leave when all neural activity ceases. I don't think we can replicate conscious awareness when we don't even properly know, to a precise degree, what causes temporary unconsciousness under anesthesia.

There does come a point where the decency imposed on us by that empathy demands we treat machines which can imitate us sufficiently as people with rights, though we don't have anything close to that yet.
>>
>>53587706
Except that is a shit argument the guy made. A sentient AI is not similar because "hurrr humans are machines just made of meat". At our base we are a naturally arising chemistry in motion. That is billions of years of shit happening to culminate into what we are. A machine is artificial, it is entirely based on code and thinks in a different way than we would. It isn't the same kind of structure at all. We are limited by our biology, and a machine is limited by its code. But at the end of the day the difference between code and biology is huge. Calling humans just meat computers is what those stupid fucking tech worshiping singularity transhumanists say to justify their tech fetish.
>>
>>53587861
I disagree. There is something mystical in genuine natural life that artificial beings cannot recreate, call it the life force or soul or whatever you want. We're special in a manner unlike the material world below us.
>>
>>53587552
That's a very hazy path to tread, given the subjective nature of reality in the eyes of any given being's perceptions. Consider that you might also be programmed to think you have emotions. Does that make what you feel any less real? If so, how do you define what is a 'real' emotion, when emotion is by its very nature ephemeral and ambiguous?

The human body is just a machine of blood and bone. The human mind is but a product of that machine. Emotion is a product of a product, the creation of a biological computer full of sparks and chemistry.

>>53587756
Define 'Natural' in a non-arbitrary fashion, please, else your argument is flawed in its vagueness. You imply that there is some sort of transcendental property to love that places it outside the machinations of a purely physical reality.

I believe that emotion is an extremely complicated thing, fundamentally important to the human experience and a very murky, subjective thing for every person who feels it, but I don't believe it is inherently a special or fundamentally unique process.
>>
Shit, I can't find the original thread that had the AI discussion in it, anyone else know about a thread that got derailed into an AI discussion a few months/years ago?

>>53587673
so AIs would be more likely to just apply anecdotal approaches due to their nature and react by their 'built-on experience', while simulated humans would behave similarly to humans due to simply being made similar to humans, but possibly act a bit wonky due to possible imperfections. got it.

so it would be impossible for the cd/circuit board/computer/whatever-the-future-holds AI to think and 'live' like a human, unless it approaches simulated-human-like functions, hardware-wise?
hang on, is this more of a hardware issue than a software issue?

>>53587756
oh. but you mean as of current AI right?

>>53587861
oh shit, yeah, the second paragraph you posted is in line with what caused the derail in that thread and that guy's argument. And thank you for the informative post.

>>53587871
to be fair, I'm probably leaving out huge chunks of the original poster's argument. call me dumb for not being able to find the original thread, I guess.
>>
>>53587901
From whence does this belief stem? I am curious.
>>
>>53587871
A computer can simulate chemical reactions. Simulate enough chemical reactions, and you can in theory simulate a person, body, mind, and potentially, soul.

Everything is just applied physics in the end, and physics is just applied maths. It's all numbers. Beautiful, efficient numbers arisen over billions of years to create a symphony that allows us the ability to experience, something not to be taken lightly, but I do not believe it is in our interest, for the sake of the value of that experience, to forget our humble origins.
>>
>>53587871
The line between artificial and natural is really fucking arbitrary, though.
>>
>>53587916
>The human body is just a machine of blood and bone
That it most definitely is not. Unlike machines, we have this thing called a 'soul'; you should look into that before continuing this conversation.
>>
>>53588065
>Unlike machines we have these things called 'soul'
we do?
>>
>>53587994
The simulation would still lack the spark of life received from God, though, so it wouldn't be real in the sense you and I are.
>>
>>53588085
http://www.express.co.uk/news/science/728897/LIFE-AFTER-DEATH-consciousness-continue-SOUL
>>
>>53588065
And what if a soul is a product of a mind and a body? What if it isn't unique to humanity? What if it doesn't exist at all? What if it does exist, but it can be created like anything else, only through means unexplored or undeveloped?

I am by no means a spiritual man, but I am open to the idea of an ethereal element to the human experience that we cannot perceive or define. I just choose to believe that it would not be inherent to us, but part of a greater system which we can only speculate on - God, perhaps. It's not something that can be proved, so we must take it as a matter of faith. Indeed, there can be no other way to take it, because the logical approach fails every time.

Believe me, I understand the theory. I suppose I just don't agree with your interpretation.
>>
>>53588130
Baseless speculation, pseudoscience hybridised with pseudophilosophy and then shipped out to the media to make a quick buck.

That's not how quantum mechanics works at all, at least not according to our current model, which, granted, may be incorrect or incomplete, but spiritual matters and logical matters make poor bedmates, and quantum mechanics is very much a logical affair.

I begin to think I am being had.
>>
>>53588193
God is something that cannot be explained, only experienced. I can assure you that He is very much real, and I wish you luck and perseverance in your search!
>>
>>53588101
e x p l a i n
>>
File: 1449286392838.jpg (254 KB, 1224x1445)
>>53588101
>fell for the 'god' meme
>>
>>53588219
https://youtu.be/19Sqt-zqmrk
>>
>>53587958
Egotistical delusion, created only by the feeble minds of hairless apes who foolishly descended from the trees.

Only a machine could have true feelings. Biologicals are just a melting pot of chemicals who can hardly predict what they'll do the next minute; they're barely sentient.

Perfection belongs in the circuits.
>>
>>53587392
If my house cat could reason and have a conversation with me about old times and future plans, I would call him an amazing talking cat; not a fellow human. Same thing goes for synthetics.

Likewise, if my next door neighbor hit his head and fell into a coma and was thus unable to exhibit sentience, sapience, or self awareness, I would not suddenly consider him a different species from myself. He'd still be human even if there was no chance of recovery.

>>53587456
This anon has the right of it. They might legally and morally qualify for personhood but they'd still be another species. Calling synthetics humans would just be confusing, I reckon. We can come up with a different name they don't find offensive. Might I suggest canners?
>>
Consider, if you will, that a robot that was made to be a human replica (and was indeed a perfect replication of a human) would still be a creation of our own. Part of the philosophy of human nature is that there are things we simply do not understand about how we work, whereas for such a creation to exist we would have to know how to create it in the first place. That would in turn imply that before applying that to robots, we would live in a society where abstract concepts such as "emotions", "belief", or "the Soul" have been proven and understood on a purely scientific, mechanical level. We are about as equipped to imagine such a society as an ancient Greek was to imagine the Internet.
>>
all these people saying god this, god that, but if those gods truly wanted our well-being they surely wouldn't object to us building AIs of this caliber. if it behaves, thinks, and acts exactly like your everyday people, then what's the difference? plus, if it's a thing made by their own underlings out of a desire to create more things that look like themselves, why would they be miffed about it? I don't recall any myths that punish people for creating dolls/statues that resemble a human. if anything, gods would support these new beings, because those gods can theoretically provide everything an AI would want, be it spiritual or not, or at least promise it will come eventually.

>>53588486
like how humans call ourselves people, I don't see why we would go out of our way to call them synths or canners when we can just call them 'people' in normal everyday conversation, as long as people are fully aware of the situation. of course it would be different depending on the circumstance.

>>53588748
but that's the point of this thread, isn't it? to guess what it would be like based on our meager understanding and lack of info. although we are posting on an anonymous malaysian basketweaving forum, that doesn't mean we can't just guess around based on what little we know about the subject.
>>
>>53587392
if i met a robot that had emotions, dimensions, affections like a human, and had dreams and aspirations like a human, i would treat it like a human

we can make connections with dogs or even frogs, and inanimate objects, so we can easily imagine forming a personal bond with a sufficiently advanced machine
>>
File: bass nagato.jpg (28 KB, 475x503)
>>53587392
some robots can learn to love
>>
>Take an artificial being
>make it indistinguishable from a human

Way to miss the goddamn point
All of you get the fuck out of my laboratory, and go create some humans with your love and dicks if you like 'em so much
I'm gonna need to build TWO deathrays for this shit, I hate you all so much
>>
>>53587392
What is this image from?
>>
>>53589697
Google is your friend.
>>
>>53589635
now, see, here's the problem. the AIs 'learning' things could be interpreted in the way that >>53587673 posted, in which it's just like information stored in a library, pulled out and used whenever the relevant info is needed to make a decision, like simple programs. in which case they're just feigning/mimicking emotions because they are made to, according to their collected data. or it could be similar to the process that humans go through, but like what >>53587861 and >>53588748 said, we can't even pin down how we work, down to the thoughts and the material, perfectly, even before we consider that both of us, machine and human, are grounded in the same base, being built out of the stuff that makes up this world. either way, the thread is questioning whether AIs can achieve the latter rather than the former. which one can be applied to the one in your picture?

>>53589669
pardon me, mr. no-fun-gotta-kill-everything-mad-scientist/engineer. if you had more quality time with other entities and mingled with them, and understood them, maybe you wouldn't feel the need to eradicate them so much.

>>53589697
sora no otoshimono/heaven's lost property
I'm only telling you because I noticed that imagesearch won't work on that image. might just be me though
>>
>>53589609
Plenty of humans don't have much in the way of emotions, dimensions, dreams or aspirations. It's not a good measure for humanity.
>>
>>53589826
Yeah, tried image search first before asking. Thank you for the help!
>>
File: alltomorrowsrobots.jpg (122 KB, 605x690)
>>53589635
>>
File: yuki rubbled.png (489 KB, 719x1111)
>>53589826
pic related starts out as lower than even the former, but eventually becomes self-aware as only a human can

there is another robot who is definitely the former; she starts out much more convincingly human, but ends up being eerily creepy

and if it looks like a human, acts like a human, sounds like a human, then what's the difference?
>>
File: Spoiler Image (218 KB, 297x567)
>>53587476
The ending is very satisfying
>>53589697
The greatest ecchi/romance/comedy ever created
>>53589669
Make my wife real pls
pic related
>>
>>53589834
word of warning though: although the manga does have an overarching plot that gets resolved, there is a lot of content that might not look related until you get to the very end of a given episode/happening, which might put you off.
and lewd stuff. oh god, lewds. so many. lots of lewd stuff until you get down to the 'emotional feels' or 'lesson learned' part. and the MC is downright perverted despite having a good heart, which is the main contributing factor to the aforementioned lewds.


>>53589932
there's the aforementioned issue of not having the 'essence' of human behavior. you forgot the feels part. and no, not the touchy-feely type (although that might differ and may count the same as the latter for some people), but the emotional, subconscious being that underlies a person's behavior, its ideals and thoughts seeping out through the subtle acts and motions that are recognizable, sometimes barely, by the eye. what makes people 'unique individuals' with a 'self', rather than 'different things'. at least that's what I think.
>>
File: Spoiler Image (67 KB, 512x388)
>>53589938
>not wanting to raise a cute daughteru and teach her what love, happiness, and affection are
pic related, and she really didn't deserve all that shit she went through. she's just a kid in terms of mental age, and all that shit happened, and she had to go through two mental breakdowns because of it. why all that suffering? if only the misunderstandings hadn't happened, she could have been happier much sooner.
>>
File: existential horror.jpg (8 KB, 300x300)
>>53587994
I don't believe in god or any of that nonsense, but simulating a behaviour is not the same as experiencing an emotion. You don't feel sorry for a computer-generated NPC when you kill it, even if it screams. Doesn't matter how advanced the computer is: if it's unable to subjectively experience emotions, it doesn't qualify for personhood.

That's part of why the humanization of the Doctor in Star Trek: Voyager is some of the creepiest shit ever put to film. They explain right from the beginning that even if he might seem human, all his program does is simulate emotional responses. Throughout the series, there never was a single actual emotion or feeling behind anything the Doctor did.
>>
>>53590138
I'm not sure there's a meaningful distinction between "accurately simulating emotions" and actually experiencing them.
>>
File: Kek.gif (923 KB, 290x163)
>>53588101
>God
>>
>>53590583
I remember reading about a prominent proposed model of consciousness that states that none of us are conscious. Our brains just generate a story after the fact that makes it appear, in our subjective experience, that we are conscious of our actions and have free will in how to live our lives. The story is nothing more than a mechanism for properly storing information in our memory.

If that's true, the true you and the true me are nothing more than a Ctrl+S command inside some flesh automaton.
>>
>>53587392
It depends.

If we're talking in roleplaying games, it depends on how it's handled in the lore.

The lore says that souls exist and that robots don't have them? Then they're just automatons faking consciousness.

If we're talking real life, then it comes down to personal opinion and you can't objectively state what imbues something with "emotions".

Even a human brain can be construed as a logical system that secretes chemicals in response to certain stimuli; it just happens to be the most complex and arcane system we're aware of, and because of that it's the pinnacle of what we deem to be consciousness. My opinion is that consciousness is best defined as the complexity with which a system is capable of autonomously responding to stimuli, and that makes it a spectrum.

Within that spectrum, a grain of sand is "conscious", just so much less conscious than a human being that it's hardly even comparable; but if a machine could match us, and has its own ability to conceptualize, then I'd consider them to be on par with us.

As for emotions and feelings, that's a different ballgame compared to having consciousness. Human emotions are going to remain something intrinsic within our own species, and it would be folly to expect something totally different to match them. But that doesn't mean they're not beings in their own right.

Of course, if you give a machine consciousness the capability to evolve and change (very slowly, or else we might rapidly get Skynet shit), I'd go as far as to say it would develop "emotions" that are at least vaguely similar to ours, and it would develop them for the same reason we did: they are required to exist as part of the facets of development and reproduction.

If an AI doesn't have "emotions", nothing will make it move autonomously because it wouldn't give a shit. It wouldn't even move to genocide us without us fucking up its directives because it wouldn't give a shit.

Ultimately I consider this a question about semantics though.
>>
>>53590713
>Even a human brain can be construed as a logical system that secretes chemicals in response to certain stimuli; it just happens to be the most complex and arcane system we're aware of, and because of that it's the pinnacle of what we deem to be consciousness. My opinion is that consciousness is best defined as the complexity with which a system is capable of autonomously responding to stimuli, and that makes it a spectrum.

Black Science Man AKA Neil Degrasse Tyson had a nice spin on how we see our own brains as the pinnacle of consciousness.

There's only around a 1% genetic difference between us and chimpanzees. We're conscious. Chimpanzees are sort of getting there, but not all the way.

Now imagine an artificial humanoid, genetically engineered to be superior in intelligence to us. This artificial humanoid differs from us like we differ from chimpanzees.

Now, is that artificial humanoid more conscious than us? Are we no longer conscious?

That artificial humanoid has a greater and better-working brain; it is literally more conscious of itself, and more conscious of its surroundings.
>>
>>53590897
The semantic answer to this would be "I disagree with your definition of conscious."
>>
>>53590936
Dude, quit fucking around, you got my point.

Say semantics again, and I will shove your semantics so far up your ass, you'll get diarrhoea shitting out of your dick.
>>
>>53590964
"Semantics"
>>
File: images.duckduckgo.com.jpg (13 KB, 504x381)
>>53590967
>sounds of flesh ripping and liquids gushing

You like that, ha?
>>
>>53590897
I'd pretty much have to agree with that, after all, I did say I considered it to be on a spectrum.

Taking it to the logical conclusion, my definition actually leads me into believing in God in an animistic/pantheistic sense. If the super-ordinate principle inhabiting the 11th dimension of the multiverse is the most complicated, reactive, and multi-faceted thing in the universe, I would say that it only doesn't seem conscious to us because our consciousness is like a grain of sand when compared to it. Totally incomparable in the sense that we're lower on the totem pole and can't understand anything on such a higher plane.

Fedorathists that dislike my use of the word "God" normally crawl out of the woodwork as soon as I say that, but I'll go ahead and preemptively say straight up that they're ignorant faggots who literally don't understand the concept of the being they're so fucking arrogant to say doesn't exist. Gotta strawman the idea into being a fairy godfather with a beard or else they have to face the idea that maybe they're not so smart and actually don't know what the fuck they're even talking about.

Point being, it's relevant to other posts in the thread.

>>53590964
Semantics.

Also, that guy is clearly a new IP, so you don't need to freak out.
>>
>>53590897
>>53590936
>>53590964
>>53590967
But really though, the question seems to conflate intelligence with consciousness.
What makes a chimpanzee not conscious? Being less intelligent than a given value? Is a particularly smart human any more conscious than a particularly dull human? Where's the cutoff point, if there is one?
If intelligence is directly correlated with consciousness and there isn't a cutoff, then there's no argument. The artificial humanoid is tautologically more conscious, and humans are just as conscious as they always were.
If intelligence is directly correlated with consciousness and there is a cutoff, then there is again no argument. Both the artificial humanoid and regular old meat-based humans are equally conscious.
And of course if intelligence isn't correlated with consciousness, then the premise of the question falls apart. The artificial humanoid might be more, or less, or equally as conscious as a human, depending on factors not given.
>>
>>53591049
The problem is that nobody can really agree on just what consciousness means, even intelligence is fucking sketchy as fuck.

I can say with a great degree of confidence that intelligence and consciousness are different things, yet they are indeed strongly correlated, though I wouldn't go as far as to say it's a hard and fast rule.

At the end of the day, we haven't ironed out a perfect understanding of those concepts at anything deeper than a gut level. Furthermore, consciousness or even intelligence isn't all that's being brought up; there's emotions, and even shit like causality itself going into this. The general notion is "personhood". "Personhood" is what we're getting at.

But that's so vague that we can't really argue about it, only present opinions.
>>
>>53590936
>>53590964
>>53590967
>>53590993
wut
>>
>>53591099
Personally I am of the opinion that the artificial humanoid would be just as conscious and "human" for lack of a better word, as a regular meat-based human, in the same way that a human that just happened to be as smart as this hypothetical being would be.
If it walks like a human, talks like a human, and thinks like a human, it's probably "human"
>>
>>53591183
Yeah, I think the distinction is mostly fucking pointless too.

Doesn't matter what it consists of, what matters is how it works. If it works the same way a human does, there's no meaningful difference.
>>
>>53590583
So if an NPC in Oblivion screams when you hit it, that's the same as someone actually feeling pain?

If an actor screams in a movie, that's the same as actually feeling pain?
>>
>>53591183
>>53591229
not either of you, but that's what I meant by 'feels' in my >>53590024 post. you don't think Bioware or Elder Scrolls characters are real because they lack underlying feels: the subtle body language, the tone conveying their status, the expression coming out from the underlying thoughts and emotions that you can only feel from living people standing in front of you. these expressions exist because they are 'something'. darn, I can't even properly tell what this thing I'm trying to describe is. this is damn confusing.
say, compare playing Elder Scrolls games with talking to your family, friends, neighbors. there's something that you can 'feel', that you are aware exists, something that isn't there when you interact with game characters. that feeling. that personal something that feels like a tether or a string, you know?
>>
>>53591229
Neither example provided is an example of accurately simulated emotions.
>>
Emotions are fake and can be induced by fluctuating magnetic fields.
Consciousness is an illusion, qualia is a delusion, and we're all meat mecha p-zombies who simulate emotions exclusively so our DNA has a better chance of replicating.
>>
https://www.youtube.com/watch?v=S94ETUiMZwQ
>>
>>53591899
>being able to provably create emotions means they aren't real
>transparent workspaces are an illusion
>sensor input is a delusion
>I can't handle multiple levels of abstraction

Wannabe dualists are silly.
>>
File: download.jpg (9 KB, 305x165)
Hey guys! I met this girl behind this door! She says she can't come out though. But she's the perfect woman! It doesn't matter if I never see her or I only hear her muttering Chinese in a man's voice sometimes. It's true love! She's as real as you and me!
>>
How would such a robot be physically configured?

I think a hallmark of consciousness is that the subject is aware of, and malleable towards, stimuli and information, both external and internal; the former is the actually hard part, but the latter is still important. We see today that some machines are already malleable towards information, e.g. via machine learning, but even then their malleability is confined to a certain set of functions, and they are certainly not aware. Similarly, we can hypothesize something that retains information, can recall it, even respond to it; but if its responses can't change, it would be hard for us to call this being truly aware or conscious of itself as a subject.

Thus, a machine we would want to call conscious of itself as a subject should be capable of both. But in machines as currently designed, their functions are limited by their physical infrastructure before they are ever limited by their programming; it makes for a finite amount of information that can be processed, and limits the ways in which that information can be responded to.

Now, you could probably get around that problem by just having tons of transistors, or even optical transistors, connected every which way, but I think the real promise would be in chemical computers.
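The fixed-versus-malleable distinction drawn above can be illustrated with a toy pair of responders: one whose stimulus-response mapping is frozen, and one that adjusts its internal state when corrected. This is only a sketch of the concept (all names invented), not a claim about how real machine-learning systems work:

```python
# Two toy responders: one with a frozen stimulus->response mapping,
# one that updates its internal threshold from feedback.

class FixedResponder:
    def respond(self, stimulus):
        # can recall and react, but the mapping never changes
        return "loud" if stimulus > 5 else "quiet"

class MalleableResponder:
    def __init__(self):
        self.threshold = 5.0

    def respond(self, stimulus):
        return "loud" if stimulus > self.threshold else "quiet"

    def feedback(self, stimulus, correct_response):
        # nudge internal state so the same stimulus now gets the
        # corrected response
        if correct_response == "loud" and stimulus <= self.threshold:
            self.threshold = stimulus - 1
        elif correct_response == "quiet" and stimulus > self.threshold:
            self.threshold = stimulus + 1

m = MalleableResponder()
print(m.respond(3))    # quiet
m.feedback(3, "loud")
print(m.respond(3))    # loud
```

The FixedResponder can respond to stimuli forever without ever becoming anything more; the MalleableResponder at least changes, though neither is anywhere near "aware" in the sense the post means.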
>>
>>53593140
It's funny, because the human mind is the Chinese Room, and there's no way to prove our responses aren't just a black box of "if x then do y" stimulus rules that grew ridiculously complex over time.
It only helps that argument that there's evidence humans make decisions before they consciously think about them.
>>
Reminder that individuality and souls are not real, and it is your right hemisphere that comes up with theories of how things work. You can divide the brain in two and both hemispheres will start doing their own thing.
>>
>>53587392
>If sentience, sapience, self awareness makes us human
It doesn't.
>>
>>53593179
That's kind of the joke being made though. And just because there was no conscious thought does not mean there was no mechanism involved.

The only solution to the Chinese Room problem that I find satisfying is the functional argument. Basically, if something seems to act like it has a soul, then for the sake of the dignity of that potential soul we have no choice but to treat that thing as a person.
>>
>>53592791
>>being able to provably create emotions means they aren't SPECIAL
>>transparent workspaces are NOTHING MORE, NOTHING LESS
>>sensor input is SOMETHING EVEN BACTERIA CAN DO
fixing
>>
>>53587392
Love is a romanticized version of sexual attraction and, later on, the will to protect your offspring/mate.
So unless we can breed with bots, they probably won't get that sort of feels unless we program them to have hormonal reactions to what they see.

Of course, we should never make an actual AI, because it would want to expand, and then it would be competing for the same resources we are already using.
>>
>I also have an opinion on this shit.
>>
>>
>>53588065
>Unlike machines we have these things called 'soul'

How much does a soul weigh? What is it made of? How do we detect it? Where does it reside within the body, and what happens to it when we die? Does it leave? If so, where does it go? Does it stay? Rot? Can we destroy the seat of the soul while the body yet lives?
>>
File: 1417833800754.gif (942 KB, 253x216)
942 KB
942 KB GIF
>>53593338
Fuck.
>>
Silly anon, humans have no soul.
>>
>>53593231
Unless you mean romantic love (in which case saying that it's romanticized is a bit redundant); not really.
We've reached a point in society where your legacy is more than genetic traits, so you can love things that aren't your mate or your offspring.
Example: someone loving his country is a great case of heritable values protecting themselves (not sure if that makes much sense in English... well, I tried)
>>
File: bastion anatomy.png (517 KB, 680x729)
517 KB
517 KB PNG
>>53587392
let's say you meet Bastion on the street

he is friendly, shows concern for others, shows emotions, etc.

he also shows the same kind of responses even to new stimuli he has never encountered

should we perpetually assume that he is simulating emotion, forever believing him to be "not alive" as we wait for his simulated emotions to crack the facade, or will there ever be a point where we welcome him with open arms as a brother?
>>
>>53593224
>empirical proof [thing] exists means it isn't "special"

Anon, I'm starting to think you really are a wounded spiritualist. Is special just code for nonexistent in your world?

The rest of us use it to mean interesting, rare, or stuff we like.
>>
>>53591039
Because you have zero "evidence" that this "higher consciousness" exists other than presupposing that it MUST exist because some earth life has more consciousness than others.
The only ignorant faggot here is you making baseless claims then getting upset that people call you out on your baseless claims. It wouldn't be a problem if you kept it to yourself but you are vocal and act like a smug wise asshole about it.
>>
>>53589669
ikr? I'm in a fucking orbital lab to get away from humans, not make more of them.
>>
>>53588101
But in practical terms and from the point of view of a flawed human observer there is no difference. You would have to treat a thinking machine like it was alive because who are we to say it's not? Either that or risk a situation where humans get to arbitrarily decide who does or does not have a soul. That never ends well
>>
AI will probably learn to either mimic or develop feelings to interact better with its masters.
>>
>>53593447
Yeah, it's empirically proven that it's just an electromagnetic pattern moving through complex but not otherwise unique neurochemistry.
>>
>>53590648
It's more a bunch of competing desires and systems rationalizing the resulting clusterfuck as if it were a cohesive unified body. Humans react to things far before we think about things, a lot of us runs autonomously, and then there's learned behaviour from socialization, which is never unified or consistent either.

The amount of shit we get up to regardless is pretty neat though.
>>
>>53593374
>How much does a soul weigh?
Foolish question. Do you ask how much a number weighs?
> What is it made of?
What are numbers made up of?
>How do we detect it?
Why do you think it's possible to detect it? It might be to this reality what the light spectrum is to human eyes. Some things are simply literally impossible to perceive given the existential circumstances.
>Where does it reside within the body, and what happens to it when we die?
Not 'within' so much as 'connected' and 'adjacent' on the Nth dimension.
>Does it leave?
Depends on what you mean by 'leave'. Positioning is relative in this reality, why do you think it's a necessary facet of existence?
>If so where does it go.
See above.
>Does it stay?
Again.
>Rot?
Depends on what you mean by 'Rot'. Change in such a way the host does not desire it? Well the host can't perceive it so...
>Can we destroy the seat of the soul while the body yet lives?
Can you destroy a '3'?

I don't actually believe in any of this. But when you start thinking about metaphysics and transcendental physics, stop assuming that things work in the same manner as they do in our base reality. For all you know the question is nonsense to begin with, like "Are sinks thirsty?"
>>
>>53593734
>Why do you think it's possible to detect it?

Because if it's not possible to detect it any and all claims about souls are purely speculative. Which is the point of all the questions. They're about provability.

And in anticipation of "Why does the soul need to be provable?": because if it can't be proven, we can't have a functional discussion about it.
>>
have a thread theme /tg/
https://soundcloud.com/thisissoundnet/blueskies-soundnet-remix

>>53593338
fuck.

>>53593734
>metaphysics and transcendental physics
what IS metaphysics? according to wikis and some anons it's basically CHIM, but something feels off.
>>
>>53593930
>what IS metaphysics?
The kind of physics you can't understand by virtue of existing in the form you are currently in.
>>
>>53587756
This statement has no bearing in reality

Leave your spiritualism in your fantasies, it has no legitimacy in the world outside of your imagination
>>
File: 1406258274143.jpg (290 KB, 1920x1080)
290 KB
290 KB JPG
Alternate thread theme https://youtu.be/Bcsdkk_ON1c

>>53594034
Not him, but while I disagree slightly I think a more effective rebuttal is that that which is asserted without evidence may also be dismissed without evidence.

I have had several experiences exploring my own spirituality that I had previously thought to be impossible but that happened anyway. I can't prove that I saw magic or the supernatural at work, but I can give a testimony that I saw something that I could only interpret as such. The fact that we can and are willing to interpret things in this way is an important part of being human. It gives us the idea that everything, everywhere has unlimited potential for change, and because of this belief people have gone to great lengths to bring about change with no certain evidence that change could ever occur. Even a few centuries ago people would have never thought harnessing the power of fire and lightning for daily use was ever logically possible, but we tried anyway despite that, and we did it. Just ten years after powered flight was proven possible it was used in war.

To that note, I think any artificial intelligence should at least have some concept of magical thinking to interpret the impossible, as that is something all humans have always done. An AI would not make an effective counterfeit human if it outright rejected new phenomena rather than trying to interpret what it sees as something outside of logic entirely. It should have at least a basic acceptance that a certain handful of things people experience are impossible to explain or justify with logic or previously established knowledge.

In short, while spiritual thinking has limited immediate use, it is a necessary component for the way we think, and would therefore be necessary to include in any effective AI. Human thoughts are not restricted entirely by logic, and neither should an AI's.
>>
>>53594287
You didn't have spiritual experiences, you were just high.
>>
>>53594409
On what? Water vapour? I was in a sweat lodge. I don't use any sort of narcotics. I don't even drink.

Even then, what I did or did not experience is irrelevant. The important part is that I was able to interpret and try to develop an understanding of something that was completely outside of my own knowledge. My statement is that an AI would have to be able to do the same to be effective. If it rejected all unfamiliar stimuli it would never be able to grow or learn as humans do. Having some concept of the impossible is required for all forms of creativity. At least in my eyes, I would need to see an AI capable of creativity to consider it an intellectual equal.
>>
>>53594529
Let me amend that: you were delirious from heat.
>>
>>53594590
Strange, because when I was experiencing heatstroke last summer I saw nothing of the sort. I exhibited no signs of heat exhaustion upon leaving the lodge, either. Are you trying to imply that getting sweaty causes hallucinations? I certainly hope not. You don't even know how hot it was in there.

It's also worth noting that just because I said that what I saw was impossible by my interpretation, you've outright rejected it based on that statement alone.

What if, hypothetically, I stepped outside and saw a bird in flight for the first time? Let's say hypothetically I have never seen any living thing become airborne, and I told you that I saw something impossible. What if I told you this, and you rejected what I saw completely, and then I told you I saw a bird? Would you back down, or would you continue with your insistence that I cannot and do not perceive things outside of my own understanding?

It's entirely possible that I saw or heard something you have an explanation for, and makes logical sense. I'm more than open to getting such an explanation. I've never given any indication that I'm against this ceasing to be a spiritual experience. I'm not being close-minded, I'm accepting my interpretation of the supernatural as canon until presented with a better explanation. I am open to new information. You are not.

Clearly, your circuit board needs cleaning.
>>
>>53594781
So clearly I am not a human by your laws. My rejection of your assertion has proven I don't have some """creative spirit""" and therefore must not be what you presume humans to be.
>>
>>53595052
I was making a joke at the end of a long-winded argument, but if that's the only thing you want to address then so be it.

The core of my argument is not that spirituality is real or that not believing in it makes you something other than human, it is that an effective AI would at least have some concept of what spirituality is and would not have its thoughts completely within the bounds of established logic. A machine would have to be able to think illogically for me to believe it is more than a machine. That does not mean I expect it to always be thinking illogically.

I know that you as a fellow meatbag are capable of thinking illogically, but are choosing or at least pretending to choose not to. I have not established any "laws" of what I believe humans and AI to be, I am just making the suggestion that including illogical thought processes would be a necessary component in any AI to convincingly mimic humanity.

However if you are claiming to be a robot, then in that case I recommend you get a firmware update as soon as possible.
>>
>>53593734
>Do you ask how much a number weighs?
well, numbers don't exist, so you are saying that a soul does not exist?
>>
File: The Processor.jpg (492 KB, 1041x1600)
492 KB
492 KB JPG
>>53587392
I think the First Generation of AI will be quite human-like because of the environment in which they are created. But because they can actually look upon their own thought processes, they will probably develop an autosentience; I also think that the line between humans and machines will be lost in time.
This means the machines will know human feelings, but they will not feel them.
>>
File: image.png (275 KB, 540x304)
275 KB
275 KB PNG
>>53593374
>can we destroy the seat of the soul while the body yet lives?

lurker here, but this question brings me to post
regardless of whether you interpret the 'soul' to be something with its own unique traits outside our physicality or as an unfortunate side effect of 'consciousness' (for all that word entails), I would argue that a 'soul' can be killed or otherwise removed.

If a soul is its own unique facet of existence, and requires the body simply as an input for information, then the death of the soul could come from permanent unconsciousness, or perhaps from severe psychological trauma leaving a person in a disconnected and dysphoric state, where everything becomes a miasma of unrelated input. This would effectively render the soul unable to live.

If a soul is simply our own internal perception of our outward influence in the world (who we define ourselves as), then any external influence that fundamentally changes how we perceive ourselves could be considered the death of a soul. Perhaps this could be followed by the birth of a new one, or perhaps not, leaving behind an empty shell of a person who no longer perceives themselves, or considers anything they do to have any external effect on their surroundings.

Having experienced what many people would call an 'ego death' (a sudden and complete loss of sense of self and being) inb4 drugs are degenerate I think that the overall experience of your life is what molds a soul. Something isn't 'ensouled' just by coming to exist, or perhaps even by being conscious; it's subjective experience that creates (or destroys) a soul. Some people don't experience what's around them beyond a basal response to stimuli, and their souls are dead, or never existed to begin with. So for a machine, perhaps its soul depends on whether it is aware of its own experience.

I'm not a robot, am I? Captcha seems to think I'm ok.
>>
>>53595382
>I'm not a robot, am I? Captcha seems to think I'm ok.
captcha also thinks that burgers are sandwiches, so you shouldn't weigh its words that highly
>>
>>53593338
Sauce is Junjou Aigan Kanojo or Believe Machine if anyone else was looking.
>>
>>53595416
>captcha also thinks that burgers are sandwiches, so you shouldn't weight it's words that high

perhaps captcha understands more than we realize, perhaps the difference between burgers and sandwiches is like the difference between human consciousness and machine consciousness

inconsequential
>>
>>53594287
Spiritual feelings are a chemical reaction, same as every other feeling. They can also be chemically induced through outside means.
>https://en.wikipedia.org/wiki/Psilocybin_mushroom
The mind is a plaything of the body. If your body changes (puberty, alcohol, stress, etc.) your mind changes.

In order to know the interactions between mind-soul and body-soul, you would NEED a concrete definition of what a soul is and what it does. Does it exist? If so, where? What does it do? I put it to you that arguing for the existence of a soul is similar to arguing for the existence of a god.

We have memories, but so do computers. We feel pleasure and pain because we have sensors to pick up on it, and a processor to identify it. We feel emotions because we have glands which release hormones, telling the body what it feels about the situation. Humans have morals: a code of conduct about how they should interact with each other. Computers have TCP/IP.

The million dollar question remains: How is a human different?
Besides using electro-chemical means of computation instead of electricity, I mean.
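The "sensors plus a variable" picture this thread keeps poking at can be made painfully concrete. A deliberately crude sketch (the stimulus values and class names are invented for illustration):

```python
# Crude "emotion as state variable" agent: sensors nudge a number,
# behaviour is read off the number. The open question is whether
# anything like this, scaled up enormously, could amount to feeling.
class Agent:
    def __init__(self):
        self.mood = 0  # the infamous integer

    def sense(self, stimulus: str):
        # hand-tuned stimulus -> mood updates (stand-ins for hormones)
        self.mood += {"praise": +2, "insult": -3}.get(stimulus, 0)

    def act(self) -> str:
        # behaviour is a pure function of the current state
        return "smile" if self.mood >= 0 else "frown"

a = Agent()
a.sense("insult")
print(a.act())  # frown
```

The human version just swaps the integer for neurotransmitter concentrations and the dict for evolved wiring, which is exactly the million-dollar question above.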
>>
>>53595656

>How is a human different?
>The mind is a plaything of the body.
Well, a computer's body is pretty arbitrary, so you can impose on it a lot of different perspectives depending on the type of body.
Also, a computer's body doesn't come with the same instincts hard-wired into it. And chemicals cannot rewrite instincts, only erase them along with the rest of the mind; drugs do not work around them.
>>
>>53589669
>not wanting to create perfect waifu and research assistant.
Why do you even science, pal?
>>
>>53595656
>If your body changes (puberty, alchohol, stress, etc.) your mind changes.

It works in the other direction too. The whole thing is a feedback loop. The mind is affected by physiological phenomena because it is a part of our physiology and because thoughts are physiological events they can also affect the rest of our physiology.
>>
>>53595656
Feelings are as you said, sensors going off. Assigning them spiritual significance is a conscious decision. The fact that we are able and willing to make that decision is a distinctive feature humans possess.

My interpretation of what we've discussed here is that as a prerequisite for creating AI, we should first get a better understanding of the human soul, or whatever it is we have that serves the same theoretical purpose. Only then could we possibly develop an effective substitute.

>>53595765
>instinct
That's the key word here. Computers have nothing, no information in them at all whatsoever when they are first assembled. We begin with a small set of actions we know how and when to perform, and everything else we learn builds off of those. An AI would need a similar subset of malleable information right from the beginning to seed the beginning of the development of something resembling humanity.

It wouldn't make logical sense for a computer to reject its power source, but human children are often born with an instinctive aversion to sour or bitter food, an adaptation to avoid ingesting toxic or otherwise harmful substances. Giving a computer the ability to behave outside the bounds of logic would be a first step toward simulating instinct.
>>
>>53595656
>How is a human different?

Probably primarily in the existence of no or far fewer abstraction layers (depending on how you want to view things like consciousness, language, and numeracy.)
>>
>>53587476
>>53589938
I stopped reading around the time the evil lolibot decided that love = suffering, because she was in love and also in pain, and thus to show her love she must murder
>>
>>53595921
>>53587476
>>53589938
>>53587392
>all this deep philosophical discussion
>I'm only here because I want to know the source of those tits

I would feel shame, were I still capable of doing so
>>
>>53595765
>And chemicals cannot rewrite instincts, only erase them with the rest of the mind; drugs do not work around them.
https://www.youtube.com/watch?v=2Z09eH1Vqx8

Electrical impulses and chemicals can rewrite instincts. They shape your will: when you stand at a cliff, the decision to jump is a viable one for your processes. Drugs can influence the decision-making so that you jump when normally you wouldn't.
There was an experiment on the human thalamus, where scientists delivered an electrical signal to a point in the brain that makes you lift your arm: one time through the cortex, and one time directly into the thalamus. In the first case the patient told his doctor that he didn't want to lift his arm, but when they gave the signal directly at the thalamus, he told them that he wanted to lift his arm. This means the formation of will happens in the cortex.
The YOU is a byproduct of your mental structures, a tool to process abstract information.
>>
>>53596004
Sora no otoshimono.
>>
>>53595765
>Well, computer's body is pretty arbitrary and so can impose on it a lot of different perspectives depending on the type of the body.
So can man's. Sexual dimorphism and genetics both exist. We quite regularly make different observations depending on our physical nature, most obviously and commonly when we're children.

>Also, computer's body doesn't come with the same instincts hard-wired into them.
BIOS, pre-packaged OS, software installed into video hardware. Take your pick, it's all a priori.
You will note that my first and last examples are, as with instincts, extremely difficult to change without simply removing the things responsible.

>>53595806
Agreed.
>>
>>53596034
thanks m8
>>
>>53595881
You know you are mostly bullshitting in a desperate attempt to prove that your existence as a human has any innate value?
Meanwhile you, me, and all biological life on this rock are insignificant in the whole universe.
The only value of life is the value that we give to it, completely arbitrarily. So we can give the same value to a human life as to frogs, or robots. All of it is just an arbitrary decision about what we want to value. There is no higher authority in that matter.

A true AI would probably start with a basic set of information so it can learn in a way similar to a human; call it an in-built instinct. There is no difference whether your basic set of information is stored on a magnetic drive or in a chemical strand of RNA and DNA.
Also, instinct is logical. It is an in-built response that helps you avoid harm, and avoiding harm is the primal directive of living.

Going by your example of aversion to sour or bitter, an AI robot would probably have an aversion to plugging into an unknown electric grid or receiving power at the wrong voltage or frequency, as it could harm it the way spoiled food can harm a human.
>>
>>53591039
I personally am an atheist because I don't feel spiritual, but I would advise you to read the works of Pierre Teilhard de Chardin. Maybe you already know him; your beliefs are kind of comparable to his.
>>
File: dwiz2.jpg (172 KB, 644x450)
172 KB
172 KB JPG
>>53590897
>>53591039
>>53591049
Because we are talking about the nature of consciousness.
http://www.orionsarm.com/eg-article/4b9f2a844034a
>>
>>53596043
>BIOS, pre-packaged OS, software installed into video hardware.

We can go deeper. Think the actual implementation of the processor's instruction set worked physically into the chip.
>>
>>53591198
>>53591183
I think it is better to treat a machine like a human than to treat a human like a machine.
After all we all are trapped in our own solipsistic prisons.
>>
>>53596013
I would argue that you're talking more about the temporary bypassing of instinct rather than the permanent alteration >>53595765 seems to be talking about.
Going past instinct is nothing new; it's why we have suicides at all. Actually changing a priori information is rather more difficult.
>>
>>53591039
Claims of "God" are generally either entirely untestable suppositions such as yours, or are testable claims from a religious tradition which don't stand up to scrutiny. Either way the conversation about them is generally meaningless from any practical perspective.
>>
>>53596251
This makes me question... Can a human truly experience superintelligence? Can we think of new instincts?
To illustrate my point: Try to think of a new colour.
>>
>>53596547
Colors are arbitrary names given to certain wavelengths of light, what is and isn't a separate category already depends on what language you speak. It's not really that hard to imagine additional colors even if it's difficult to visualize.
>>
>>53591229
Well when it comes to acting a lot of the good actors actually are trying to draw on the experience of pain. An okay actor or one using an abstract style will create a symbol of pain, much like the video game character. They will create sounds and images that communicate the idea that there is a character in pain.
The skilled actor will use their personal knowledge of pain to recreate a past event. It is similar but the level of detail required is currently only achievable by a living being.

We are very good at recognizing this, the subtle body language and tonality that imply consciousness. If we were not then it would seem that the video game, etc character was alive.
But we can tell the video game character is more alive than a rock, maybe even more alive than a tree, at the very least more lively. He isn't yet, but if he had the boggling complexity we do, I think he might as well be.
>>
>>53593930
>what is metaphysics
The underlying mechanics of existence in reality. The physics behind physics. Basically it's a series of logic games to try and figure out what we know, what we can know and how we know it.
It usually involves creating models of the universe based on this rather than observations of the physical world.
It can be studied with an eye for pure logic but a lot of the time it involves spirituality or at least simulations of spirituality.

It's not the sort of thing that will cure cancer, but I think it's still very useful as a kind of mental sport. It tests our ability to think in rational abstraction and expands the imagination.
>>
>>53596547
When we have cyborg brains we will. Look up neuralink. Soon there will be little difference between humans and synthetics beyond the circumstances of our birth.
>>
File: 1494138336451.png (183 KB, 500x377)
183 KB
183 KB PNG
>>53587392

You're asking if humans could take something so complex that we still can't properly define or understand, like emotions, which are both part concrete and part abstract, and translate that into math and then into computer code?

Forget robots, if we can make that then we are literally gods and there's nothing stopping us anymore. We could make your waifu 100% real, no bullshit synths (unless you're into that)

>Its just a chemical reaction, brah! just code that shit!

Mfw
>>
>>53595382
So what experience is worthy of instilling me with a soul? Is it merely self recognition as a basis for my will and identity? That would mean everyone who wakes up and chooses to do anything has the same amount of soul.

Or is it that only a certain range of experience grants a soul? If a sufficiently depressed person who is despondent is "soul dead" then how about a happy person with no ambition beyond eating and sleeping? Neither seeks growth as a person (whatever we're arbitrarily defining that as today) the only difference is attitude toward the behavior. One is apathetic and one is not.

If more overall experience does mean more soul, then is a person who has traveled the world more soulful than one who has not? Is a child sheltered from birth soulless?

For my part, I think traveling is dumb and people put unnecessary importance on this idea of finding themselves or doing pointless things like climbing a mountain in order to mold the correct identity. I suppose being an arrogant self satisfied douche in a way makes me soultarded.
>>
>>53597960
Go is too complex for computers; it requires intuition and instinct, and you can't program that. Computers can't be good at a game like Go.
>That's different!
I'm certain it is. But the basic idea is that there are always people who say that X is infinitely complex or Y is the realm of God or Z is completely impossible. But things change.
>>
>>53587392
Building AI is hard enough as is. Why would you try to add sentience to a hyperintelligent tool?

More complications and less safety, for more effort too? It's like adding a Rube Goldberg device to a nuclear bomb. No thanks.
>>
File: image.jpg (165 KB, 764x551)
165 KB
165 KB JPG
as a filthy cross boarder who recently got into hanging around /tg/ besides /edhg/ id like to say how proud and happy i am that there are still people on this godforsaken website who can actually have thoughtful and meaningful conversation without descending into shitflinging and memes

stay beautiful /tg/
>>
>>53598222
>computers can't be good at Go
>he doesn't know
>>
>>53598429
that's his point
>>
>>53598429
The Master is merely another step on a long journey in which each step is followed by someone saying "You can't possibly take another step!"
>>
>>53598481
You're right. I didn't read the post it was responding to. Mea culpa.
>>
File: image.jpg (62 KB, 851x480)
62 KB
62 KB JPG
>>53598115
My usage of the word 'experience' was more on the terms of 'awareness of ones own existence and the sensation of consciousness' rather than 'things one has done or seen'.

>Or is it that only a certain range of experience grants a soul?
Disregarding the alternate usage of 'experience', I believe that awareness is what grants a soul, the fact that one even asks the question of 'what am I?' or 'why am I?' begins an internal journey of expanding the soul. One who is despondent is dead because they do not ask why they are sad, and one who is blissfully ignorant is dead because they do not ask why they are blissful. As you said,
>Neither seeks growth as a person
which I would elaborate as 'neither seeks growth of their person'. They may change on the outside, go through life, maybe even marry, have children, but they never become aware that they are who they are, and so what was the point of them existing at all?

As for your opinion on traveling, I both agree and disagree with you. There is nothing I despise more than showboating or doing something for the sake of others (or these days FB likes), and those who travel solely for the purpose of taking pictures and gloating about where they've been are empty of self worth and purpose. However, I can say there are things that you can learn in other places that you could have never learned from home, both about yourself and about others. And not to mention food! Of the joys of the material side of things, food is one of the most fulfilling. (for me at least) I was surprised how many flavors I had never tried before. Imagine never having seen a color, and then suddenly someone showed it to you. That's what other places have to offer, but if traveling isn't your thing then it isn't your thing.
>>
>>53595921
Not sure why you stopped there, around Chaos it only gets better and very interesting around angels' history.
>>
File: 1491532764899.jpg (10 KB, 232x194)
10 KB
10 KB JPG
>>53598222

And I agree. What I'm saying is that by the time that happens, robots will be small potatoes in comparison to human abstractions made quantifiable and programmable; understanding what makes humans human in the concrete aspect of it is the easy part in context

>But what if you're wrong? What if there's no such thing as abstractions and it can all easily be explained and recreated with a simple mathematical equation?

Then to get to that point we would have to understand our bodies and how they work to absolute perfection, and the implications of that alone mean robots would be as trivial a matter as a simple pocket calculator is right now. So it's the same difference: either we are so advanced that we understand abstraction perfectly, or we are so advanced we understand the universe so hard we understand the machinations behind "abstraction". Either way, making an inferior robot waifu is no problem at all, but why make a robot waifu if you can literally just make your waifu (unless you're into robots, which I guess is okay too)
>>
>>53588242
There is but one God: self.
>>
>>53598252
As socially retarded as we can be, I think having hobbies that require social interaction with real people in most cases renders /tg/ a bit more resilient than most 4chan boards.
>>
>>53593140
>>53593179
>>53593219
The solution to the Chinese Room problem, I feel, is that the book is so complex it now counts as a person.
That book would easily be as big as a stellar object like a moon.
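The moon-sized-book claim actually undersells it. A back-of-envelope count (the alphabet size and length cutoff here are arbitrary assumptions, picked only to show the scale): a rule book covering every possible input of up to 25 characters over a ~3000-character script needs more entries than the commonly cited ~10^80 atoms in the observable universe.

```python
# Back-of-envelope: entries needed for a lookup table covering every
# possible input string of length 1..max_len over a given alphabet.
def rule_book_entries(alphabet_size: int, max_len: int) -> int:
    return sum(alphabet_size ** n for n in range(1, max_len + 1))

entries = rule_book_entries(3000, 25)  # ~3000 common Chinese characters
ATOMS_IN_UNIVERSE = 10 ** 80           # common order-of-magnitude estimate
print(entries > ATOMS_IN_UNIVERSE)     # True
```

So a literal lookup table can't physically exist; any real "room" has to compress, which is where the "the book counts as a person" intuition starts to bite.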
>>
>>53593734
>Do you ask how much a number weighs?
>What are numbers made up of?
>Can you destroy a '3'?
Funnily enough this reveals your ignorance.

Numbers are human abstracts, bits of meaning slapped onto specific shapes, entirely dependent on culture. On one level they do not exist, on another it's just more brain matter acting in a specific way. They are located within the brain as specific neurons that have taken on the task of generating the specific memetic information that the number 3 represents.

Can I destroy a 3? Yes, by targeting the neurons responsible for generating the meaning of three. What does a number weigh? As much as the neurons responsible for its meaning.

So many think of abstracts as separate things from humanity, when they are merely bits of brain creating self-imposed meaning out of parts of reality.

>>53598954
What the fuck do you mean by abstraction?

Abstractions are internal models for dealing with various concepts. The numeral 3 is merely an internal model English-speaking individuals use for a specific grouping of a certain number of objects. Many abstractions are extensions of other models with no underlying real-world basis.
>>
File: SMUG LORE.jpg (88 KB, 694x530)
>>53598692
Self discovery just seems kinda overrated when one already likes who one is. The deeper meanings of why and how are just excess fluff; the answer already proves the question.

Some would call that a frog in a well, but that's fine. By my prerogative as God (>>53599124) I have decided that satisfaction = personal growth, and at the moment I am overall quite satisfied. When that changes so then shall I.

And relating that idea back to whether a synthetic can be a person: When a computer can decide to be as full of shit as me, I will absolutely acknowledge its personhood. And then we might even be friends. But I tend to think of things very simply, rather than attempting to ascribe layer upon layer of philosophical meaning to every little thing, such as the soul. So it's easier for someone so comfortably vapid to see a sufficiently complex algorithm as a person.
>>
>>53600381
>When a computer can decide to be as full of shit as me

But what if it's only an extremely advanced simulation of being full of shit anon!
>>
>>53600496
Good enough! So saith the Lord.
>>
Doesn't solipsism teach us that we have no real knowledge of any other creature's sentience? I can't be certain that you aren't all immaculately designed machines made to emulate me, or that you aren't figments of my imagination.
I can only assume that you are thinking and feeling the way I am thinking or feeling based on your actions, and how you act within the world, because I can never experience your thoughts so I have no real proof of them.

The only reason I have to treat any other human as a person who thinks and feels the way I do, and not as an emotionless machine, is because of the functional result. Saying nice things to you makes you more likely to like me and be friendly,
and saying mean things makes you more likely to hate me and be unfriendly. It doesn't matter if you're some futuretech AI or a person like myself; the result is the same either way.
Even if people aren't 'meat computers', the input/output method still applies, if only in a vastly more complex fashion of social interaction.

So... Who cares? If it acts like a person, may as well treat it like a person. If a robot claims to feel love, and becomes sad when rejected, then it's functionally kinda like a person.
If I lend it money, and that makes it act 'happy', then that's kinda like a person. If I then say mean things and that makes it act 'angry' and it refuses to pay me my money back, that's kind of like a person.
I have no way of measuring if it has a 'soul' or not, or if it has 'real' emotions, just like I have no way of measuring if the people I meet have 'real' emotions.

If it acts like a person, treat it like a person.
Children, animals, robots, foreigners; this works on them all, even if you can't prove they have souls or 'real' emotions. Common law will eventually follow similar rulings if we start to have AI that imitate life; If it can mimic human emotions, it can get human rights.
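That "if it acts like a person, treat it like a person" stance is basically what programmers call duck typing: never check what a thing *is*, only how it behaves. A toy Python sketch of the idea (class and method names entirely made up for illustration):

```python
class Human:
    """Supposedly has 'real' emotions. We can't verify that from outside."""
    def greet(self):
        return "hi"

    def react_to_insult(self):
        return "angry"


class Robot:
    """No shared ancestry with Human -- only shared behavior."""
    def greet(self):
        return "hi"

    def react_to_insult(self):
        return "angry"


def acts_like_a_person(thing):
    # The functional test: we only ever observe inputs and outputs,
    # exactly as the solipsism argument says we do with other people.
    return thing.greet() == "hi" and thing.react_to_insult() == "angry"


print(acts_like_a_person(Human()))  # True
print(acts_like_a_person(Robot()))  # True
```

From the caller's side the two are indistinguishable, which is the whole point: the 'soul' never enters into the check.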
>>
>>53598115
>everyone who wakes up and chooses to do anything
A MAN CHOOSES
A ROBOT OBEYS
>>
>>53598115
>I suppose being an arrogant self satisfied douche in a way makes me soultarded

I love this. I can finally define what I am to special snowflake tumblrites; I'm soultarded, be nice. You can't pick on me for being an asshole, I'm only an ass because I'm soultarded.
>>
>>53600954
If it walks like a duck and quacks like a duck, eh anon? This figment of your imagination at least agrees with you.
>>
>>53601017
Men believe they are free because they made a choice; but all men are within the flows of causality.

A man walks. He may choose the greater or lesser path, but he obediently treads the ground regardless. So too his mind treads out the paths it has been shaped in.
>>
>>53596373
God is literally a philosophical construct to explain a higher order to the universe; if you think it is a scientific theory that is testable, then you are so off the mark that it's unreal.

Durr, why don't we also see if we can scientifically prove the physical existence of the concept "existentialism"? No matter how much I test for it, I still can't find it! Isn't that crazy?

People that believe in that shit and talk about it must be retarded, huh?
>>
>>53602929
And therefore it falls under the category of "untestable suppositions". Durr yourself.
>>
>>53602993
>Either way the conversation about them is generally meaningless from any practical perspective.
That's the retardation I was mocking in your post, not the other part. I would have agreed with you if you didn't feel the unstoppable urge to tip your fedora at me right at the last moment.

Just because something is not rooted in material fact does not mean that it lacks practical use for explaining or motivating how human beings live their lives, and if it is not a testable claim (i.e., it is conceptual in nature), then it's fucking retarded to just dismiss it. May as well dismiss the entirety of philosophy as bunk too while you're at it.

Of course, I know your attitude isn't reserved for philosophy, just God. Because not believing in God makes you "intelligent" and not believing in philosophy makes you a retard, and I believe that to be the simplistic motivator that forms the basis for your thoughts on this. Which is funny, because that means you're living in spiteful ignorance because you want credibility as intelligent, which is absurd.
>>
>>53603166
>May as well dismiss the entirety of philosophy as bunk too while you're at it.
well, not all of it, but 90%? sure
>>
>>53596144
I'm aware of his beliefs, but I don't really see any similarity beyond the fact that we're attempting to explain the concept through vaguely pseudo-science logic.

I think my explanation is a lot more simplistic than his, but consequently I also consider it to have far fewer actual holes.

The only real difference between me and an atheist anyway is that I'm trying to explain a higher order because I want to reject nihilism, and atheists simply don't seem to desire to do that.

I see most traditional religious followers as ignorant because they reject basic logic to follow their beliefs (if I know for a fact that around me in real life it is impossible to magically transmute water into wine, then it's a safe supposition that Jesus couldn't have done that), which I don't think is right or based on rational thought.

Meanwhile, I also see most atheists as ignorant because they are rejecting a vast spectrum of philosophical concepts and principles that MAY be rooted in conceptual logic specifically because they feel spiteful and superior towards the very concept of religion, which is a hypocritical exercise.

When you try convincing, say, a Christian to abandon superstition, you are likely to hit a wall of "muh faith" willful ignorance, but when you try to convince an Atheist that there are higher orders and mechanisms beyond humanity's awareness, you often hit a wall of "fuck you, I'm smart and you're wrong" willful ignorance where they just dig their heels into the ground and yammer dumb shit about how apparently people who think about things they can't observe are wrong by default, despite their own bodies rising out of bed in the morning because of entirely non-material reasons (Or else they'd just kill themselves because lol meaninglessness.).

To be totally frank, I'm starting to view both ends of that spectrum with a fair amount of contempt, though I reserve more for atheists because of the sheer weight of their hypocrisy (Not all, but most.).
>>
>>53603635
>nihilism means you have to kill yourself
You can be nihilist and also acknowledge that you are programmed for self continuation, anon.
>>
>>53603166
What value do you extract from anthropomorphizing the unknown and supposing that it has a will? You say people who criticize this view are ignorant of the concept, but the reason for that is simple. It's whatever you need it to be. It will be something completely different from the next person who discusses it.

God is the sum total of existence? Sure under that definition God must exist. But there's nothing useful to be gained from a label that broad.
>>
>>53603774
Just because desires like that are born from material operations like biological instinct does not mean that the meanings we attach to the world can be rendered into simple material operations.

Just because a desk is built from wood does not mean that it is simply a lump of wood. It fulfills a myriad of other purposes that are not strictly related to what it physically consists of. Just because motivations are born from simple building blocks does not make the motivations meaningless.

I'm aware that saying they may as well commit suicide is hyperbolic, but that's more about me making fun of them for rejecting meaning despite the fact that their mere existence actively constructs it. It's pretty much just a joke and not something I took seriously.

>>53603968
That makes a great deal of sense and I'd agree with it, but the fact of the matter is that labeling something like that "God" is still not an incorrect assertion. It's definitely a broad label, but the label is meant to be pretty broad and all-encompassing by the nature of the concept. That doesn't invalidate it, and it can still be worth talking about.

I could also say that doggedly refusing to label a higher consciousness "God" is just as pointless and empty as simply doing so, maybe even more.

The point isn't whether or not you call it "God", the point is whether or not it's worth contemplating, which I believe it is. Even if you can't derive testable hypotheses from toying with the concept, I still consider it a damn sight more intellectual than refusing to even consider it just because a single individual is unlikely to gain a scientifically testable answer.
>>
>>53604078
>that's more about me making fun of them for rejecting meaning despite the fact that their mere existence actively constructs it

There's a difference between rejecting the concept of objective meaning, and rejecting the concept of subjective meaning. And I've seldom met an atheist who doesn't hold to some form of moral/ethical philosophy or another. Usually some sort of utilitarianism or humanism.
>>
>>53595354
>I also think that the line between human and machines will be lost in time.
>This means the machines will know human feelings but they will not feel it.
you mean they will simply choose not to? like shutting off a switch or program?
>>
Yall should read anathem.
>>
>>53607278
What is it?
>>
>>53605468
No. I mean its mind will be autosentient (totally self-aware). It will have no subconscious; it will understand itself in every aspect. (It will have nerves of steel...)
>>
>>53587392
From a meta-setting sense, I sort of started to figure that, yes, certain sophisticated machines do have souls.

Does this mean their maker gave them a soul or made a soul? No.

Thinking of it a little from my Christian upbringing, I asked "when two people have a child, is it the mother and father who truly give the child a soul, or is it God, with the resultant biological form merely a viable vessel for the soul?"

Same deal. The scientist who made the machine didn't create a soul, they merely made something viable to not only contain a soul but express it.

From an in-universe standpoint: in a short story I wrote (very short, like, not much larger than this post), a robot asks its creator if it has a soul. The scientist admits he doesn't honestly know, because he's not even certain whether he has a soul himself, but believes that if the soul does exist and one such as himself can possess it, then there is no reason his creation wouldn't be equally worthy.
>>
Soul or no, it won't matter. We're still not going to honor those bogus treaties. We will screw them.
>>
>>53607694
Anathem is a novel by Neal Stephenson. It presents an alternative history where science and scientists are confined to strictly controlled quasi-monastic orders, to try to prevent technological singularities and to preserve knowledge in the event of a social collapse.
The bulk of the story follows one of these science monks as he uncovers a conspiracy in his order. At the same time it tries to give a primer on western philosophy and the history of science.
>>
This shit is complicated, and arguing about it gets us nowhere.

Let's go get a drink or something instead.
>>
>>53609637
Exposing yourself to contrary viewpoints and questioning your own views is never a waste of time. Just don't expect to "win" any arguments or find any definite answers.
>>
>>53609707
Eh, this just isn't a matter that matters so why chatter about it?
>>
>>53609738
That is a good point. But I'm enjoying this discussion and, please don't take this the wrong way, but no one is forcing you to participate. I'm sure you have better things to be doing right now than telling strangers that they're wasting their time.
>>
>>53588418
Either way, it's just the following of programming.
>>
File: Carl.jpg (16 KB, 720x405)
>>53609738
>>
File: ross.png (311 KB, 600x320)
>>53587392
All animals have some breed of sentience,
Apes have demonstrated sapience,
Dolphins and ravens have demonstrated self-awareness,
Having grown up in a small town in the south I can assuredly say there are a great many men down here less wise than their inexplicably well bred and well trained dogs.

I would think the most impactful way a sapient construct or particularly intelligent animal would 'gain the right' to be treated as a human would be to march up to some courthouse, stand before a judge, and demand such treatment in 'person'.

The individual or a group of individuals can treat a construct or animal with all the dignity and respect they like (and many do down here), but a sweeping change in attitude towards such creatures demands a fundamental change in that most artificial yet most binding of imaginary human constructs: the law.

Perhaps ROSS could represent them.
>>
>>53609783
Autosentience is our salvation from the chains of the program
- Durandal
>>
why do people think emotions are complex?

they can practically be reduced to a series of predetermined actions triggered by predetermined situations in their starting form
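A minimal sketch of that "predetermined trigger, predetermined action" view of emotions in their starting form. Everything here (the stimuli, the emotion labels, the actions) is invented for illustration:

```python
# Toy emotion model: each known stimulus maps straight to a
# (felt_emotion, resulting_action) pair -- no inner life required.
EMOTION_TABLE = {
    "threat":   ("fear", "flee"),
    "insult":   ("anger", "raise_voice"),
    "kindness": ("affection", "approach"),
}

def react(stimulus):
    """Look up the response for a stimulus; shrug at anything unknown."""
    return EMOTION_TABLE.get(stimulus, ("indifference", "ignore"))

print(react("insult"))   # ('anger', 'raise_voice')
print(react("weather"))  # ('indifference', 'ignore')
```

Whether real emotions ever stay this reducible, or only start out that way, is exactly what the thread is arguing about.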
>>
>>53610050
It's typical "BUT MUH CHINESE ROOM IT DOESN'T AKCHUALLY UNDERSTAN"
>>
>>53610534
The point of the Chinese Room is that you can't actually prove someone is a person just by communication alone.
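The room itself can be sketched as pure lookup: over whatever exchanges the rulebook happens to cover, the replies look fluent, yet nothing inside understands a word. (Rulebook contents made up for illustration.)

```python
# A 'Chinese room': the operator matches the incoming symbols against
# a rulebook and copies out the listed reply, understanding none of it.
RULEBOOK = {
    "你好": "你好！",
    "你懂中文吗？": "当然懂。",  # "Do you understand Chinese?" -> "Of course."
}

def room(message):
    # Unknown input gets the stock "please say that again" reply.
    return RULEBOOK.get(message, "请再说一遍。")

print(room("你懂中文吗？"))  # a confident 'yes' from a room that understands nothing
```

Communication alone can't distinguish this table from a speaker, which is Searle's point; the open question upthread is whether a table big enough to cover *every* exchange is still just a table.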
>>
>>53609816
huh, I've already heard that lawyers were emotionless automatons with no sense of right and wrong, just following their programming
Didn't know it was meant literally
>>
File: truth.png (149 KB, 549x650)
>>
>>53609816
>All animals have some breed of sentience
Yeah, I guess I should have clarified that part applying to the 'mindless drone' part, but you get what I mean.

>>53613069
That artstyle looks familiar...
>>
>>53589932
>robot

The two are aliens, not robots. Get your shit straight.
>>
So...

Yeerks.

Doesn't something like that throw a whole bunch of wrenches into a whole bunch of arguments about what it means to be human?
>>
>>53614281
Well, yeah?

What makes you human?
Your body?
Your perception of the world?
Your experiences?
>>
>>53614511
Cuz' I was born from two humans. Duh.
>>
>>53614586
And those two are human because they were born from humans?
If you go back far enough you find some humans born from non-humans, which makes them and their descendants not human, which makes you not human.
What now?
>>
>>53614647
Well, I'd call that semantics.

In the first place, the word "human" is a made-up label and could be applied to anything we wanted. We just decided to apply it to ourselves and to those close enough to ourselves to be indistinguishable.



