The debate "People would work exactly the same with physical mechanisms without need for subjective sensations" was started on January 6, 2020, at 2:31 am.
8 people are on the agree side of this discussion, while 10 people are on the disagree side.
People are starting to choose their side.
It looks like most people are against this statement.
diecinueve posted 3 arguments and Allirix posted 1 argument on the agreeing side.
Nemiroff posted 1 argument on the disagreeing side.
diecinueve, ashok, Allirix and 5 visitors agree.
Tyrice12, Nemiroff and 8 visitors disagree.
I guess my main point was: if consciousness and subjective experience are just features of one specific type of complex neural network, then they're almost certainly not features of every complex neural network. Maybe past a certain scale, complexity, and redundancy level consciousness always emerges; in that case you'd be right, because being a general decision-maker at our level would require consciousness. But I don't think that's the case, and there's no way (that I can think of) to prove it one way or the other.
i think you are misrepresenting my position. i never said the mind is non-physical, metaphysical, or any form of supernatural. the title of this thread states that we do not need subjective sensations. you described how those sensations arise, which does not negate the need for them.
for the record, the use of the word "sensation" is technically wrong; the opening title should say we do not need subjective perceptions. sensation is the objective interaction of our senses with the outside world; perception is what our brain turns those sensations into.
i gave the example of insects and bacteria, which likely lack perception and act like the ai of gta today: rudimentary scripted reactions. we have what is analogous to your monster neural network, which gives rise to our perception and elevates us above a handful of scripted reflexes. without the perception given to us by something like your neural network, we would be simple gta-man ai, and be very different from who we are.
thus i will respond to the title by saying that without our subjective perceptions we would not work the same.
That is why, if we did not have consciousness and only had the structure of the neural network, we would be like an AI and could therefore work exactly the same.
The subjective window is just a structured and filtered way to throw complex information at consciousness. Consciousness is just a complex structure of the neural network that contains a large part of the learning algorithm. Both work together to decide how to react optimally to their environment now and in the future. Another process could throw the same structured and filtered information at an unconscious learning algorithm and reach the same decision.
As an example, AI can already take image data as input, filter and structure it across several convolutional layers, feed the structured and filtered features to support vector machines, and make the same diagnoses as conscious experts in several medical-imaging fields.
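The filter-then-classify pipeline described above can be sketched as a toy (my own minimal construction in NumPy, not a real diagnostic model): a convolution "filters and structures" the image, and a linear decision rule stands in for the SVM stage.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: slide the kernel over the image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A hypothetical edge-detecting kernel: the "filter and structure" step.
edge_kernel = np.array([[1.0, -1.0],
                        [1.0, -1.0]])

# Two toy 4x4 "images": one with a vertical edge, one uniform.
img_edge = np.array([[1, 1, 0, 0]] * 4, dtype=float)
img_flat = np.ones((4, 4))

def features(img):
    # convolution + ReLU, then pool everything into one feature
    fmap = np.maximum(conv2d(img, edge_kernel), 0.0)
    return np.array([fmap.sum()])

# A fixed linear decision rule standing in for the trained SVM:
# sign(w . x + b) separates "edge present" from "no edge".
w, b = np.array([1.0]), -1.0

def classify(img):
    return 1 if features(img) @ w + b > 0 else 0
```

No step in the pipeline is conscious, yet `classify(img_edge)` and `classify(img_flat)` separate the two cases exactly as a perceiving observer would.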
Okay, I think I understand your position better now, but I still disagree that there's a functional difference between the mind (sentience, consciousness, etc.) and the structures that facilitate it.
Yes, GTA man, with preprogrammed reactions, doesn't have a subjective window interpreting his world, but an algorithm is believed to simulate this window.
If you don't think this algorithm exists, you're not alone. We don't have the computational power to come close to testing it yet, but it's been designed and is theorised to be conscious when fully simulated. The easiest way to think of it is as a five-input, super-connected neural network with tens of billions of neurons and unknowable levels of redundancy, with an asynchronous learning algorithm (defined by the current iteration of the network) that trains multiple weights per neuron as well as the actual mathematical functions connecting each node that use those weights.
Consciousness is believed to emerge from distributing this self-referential learning algorithm (where the current iteration determines the next iteration) across the stupid levels of redundancy in this massive super connected network. Some of the nodal structures may be strongly linked to the distributed learning algorithm and be noticed by it, like the inputs and key operators defining neural connections, and some basic processes may become weakly linked and effectively automatic and unconscious by the nature of optimisation. There's far more to it but that's my best explanation.
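The "trains the mathematical functions connecting each node, not just the weights" idea can be illustrated with a toy edge (my own hypothetical sketch, not the design described above) whose connecting function carries a trainable shape parameter alongside the ordinary weight:

```python
import math

class Edge:
    """One connection whose function, not just its weight, is learned."""

    def __init__(self):
        self.w = 0.5  # ordinary connection weight
        self.k = 1.0  # shape parameter of the connecting function itself

    def forward(self, x):
        # the function on the edge is w * tanh(k * x); learning can
        # reshape the function (k), not just scale it (w)
        return self.w * math.tanh(self.k * x)

    def update(self, x, error, lr=0.1):
        # gradient-style updates for both parameters
        t = math.tanh(self.k * x)
        self.w -= lr * error * t
        self.k -= lr * error * self.w * x * (1 - t * t)

edge = Edge()
target = 0.8
for _ in range(200):
    edge.update(1.0, edge.forward(1.0) - target)
```

After training, `edge.forward(1.0)` has converged near the target: both the weight and the shape of the connecting function were adjusted to get there.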
Would the GTA man programmed with this still have superficial sentience? It's at least a higher-order sentience than an insect's, since we can simulate their brains and they don't have a powerful self-referencing learning algorithm linked to senses and internal structures.
So just like us, GTA man's experience is defined by whatever is strongly linked to his learning process. For us that's the inputs (senses) and the complex combinations of operators interacting with each other between neurons (chemicals and connecting structures in our brain). Happiness, for example, is just one of the operators that define the mathematical function between neurons that the learning algorithm uses to increase the weight of a neuron when it's responsible for a valuable result. His goals, dreams, and motivations are just structures in the network that store optimal paths to whatever he was trained to value. Our values come from culture, his values may come from initial conditions.
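The idea of happiness as an operator that "increases the weight of a neuron when it's responsible for a valuable result" can be sketched as a tiny reward-modulated update rule (my own toy construction, not a model of a real brain):

```python
# Toy agent: two possible actions, one weight each.
weights = [0.0, 0.0]

def act():
    # greedy: pick the action with the larger weight
    return 0 if weights[0] >= weights[1] else 1

def reward_update(action, reward, lr=0.1):
    # the "happiness" operator: a scalar reward that increases the
    # weight responsible for a valuable result, and decreases it
    # after a bad one
    weights[action] += lr * reward

# Environment where action 1 is always the valuable one.
for _ in range(20):
    a = act()
    r = 1.0 if a == 1 else -1.0  # reward signal plays the happiness role
    reward_update(a, r)
```

After training, `act()` prefers action 1: the agent's "motivation" is nothing beyond the weight structure the reward signal carved out.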
the distinction i make is easily seen when comparing high order mammals to insects or bacteria. all things react to stimuli, including pain, but that doesnt mean they have a running perception of their sensations; a stream of consciousness.
although our individual conscious experiences are subjective, the existence of those conscious experiences is objective. i'm not sure at what order of development it arose, but humans, several mammals, and a few select other candidates almost certainly have it. most life, however, doesn't. a spider's perfect web doesn't come from an understanding of engineering or aerodynamics but from base instinct implanted over millennia. they will never learn another pattern, no matter how much simpler than their original.
the question i ask is of consciousness/sentience. spirituality is simply an attempt at answering that question. like the origin of life. it is by far not the only answer.
unless you believe spiders and bacteria have a running monologue like we do, the distinction should be clear. sorry for confusing you with the original poster, though. my bad.
I brought up the issue of mind-body-dualism language in this sort of debate because I've seen it used in neurological studies, since it's how we comprehend subjective experiences. It's been heavily criticised there because the mind is nothing more than the brain that houses it, yet the language separates the two. So sentience and the chemical processes it emerges from are the same thing. We intuit a difference because the mind is only wired to the senses, which don't live in the brain and so are blind to it, and to feelings, which are sensed at the same emergent level as consciousness and are equally blind to the actual mechanisms of the brain.
The only way sentience and the processes it emerges from could be different is if there were something more, like a spirit. Thus mind-body dualism describes what you're implying.
the gta person will never break out from a limited series of loops.
we do have automated functions, instincts, where our actions occur before our thoughts. however, those are very limited, as are our possible reactions to them, just like the gta npcs. mechanical, instinctual reactions will never have dreams or goals, no motivation or happiness. it would seem the same on a very superficial level, but it would not be the same at all.
Subjective sensations are caused by mechanisms in our brain; even if we did not have those sensations, we would still have those mechanisms and therefore act exactly the same.
A character in a video game cannot feel pain or have feelings, because it is only data on a computer; but if I hit someone in GTA, that person will act as if it had hurt him, without needing subjective sensations.
i was not speaking of soul vs body, but of sentience vs chemical reactivity. your original framework was about sensation vs mechanics, after all.
subjectivity is not synonymous with spirituality.
The mind-body dualism you're proposing has no evidence in modern science. To clarify: there's no evidence the mind is a non-physical quantity. It's just a product of the physical brain and body. There's no evidence of a special soul or supernatural phenomena. That means the experiences of the mind are just chemicals reacting to stimuli. They're all physical mechanisms, so yes, we would work the same way if we were just physical mechanisms, because that's all we are.
we would work, but would we ask questions and develop the society of today without happiness?
how can you have drive, motivation, or any form of dream without a notion of happiness upon achieving that dream?
the human body would work, but it would be an animal focused only on today. it is subjective sensations that allowed us to dominate this planet, on our scale.
People would work exactly the same with only physical mechanisms in the brain without the need for subjective sensations (such as pain, happiness, colors, etc.)