Sentience and brain size

"The question is not, Can they reason? nor, Can they talk? but, Can they suffer?" - Jeremy Bentham
User avatar
Brian Tomasik
Posts: 1107
Joined: Tue Oct 28, 2008 3:10 am
Location: USA
Contact:

Re: Sentience and brain size

Postby Brian Tomasik » Sun Dec 02, 2012 11:42 pm

Pablo Stafforini wrote:Of course, you could argue that we can recognize that consciousness exists, and then claim that consciousness is reducible. Yet Chalmers's position is that no reductive explanation of consciousness can be given (he supports this claim with a battery of sophisticated arguments).

Yeah, I think consciousness is reducible, but I haven't read Chalmers's battery of arguments.

Pablo Stafforini wrote:Just to be picky: I don't think there is any "ethical" meaning of 'consciousness'; there is no sense in which 'consciousness' means "what we care about ethically".

Haha, maybe I'm the only one who uses it like that. ;) I agree it's not a good way to employ terms, but I haven't invented a shorter alternative for "the kinds of mental operations that I care about."

User avatar
Arepo
Site Admin
Posts: 1097
Joined: Sun Oct 05, 2008 10:49 am

Postby Arepo » Mon Dec 03, 2012 12:50 pm

Brian Tomasik wrote:Haha, maybe I'm the only one who uses it like that. ;) I agree it's not a good way to employ terms, but I haven't invented a shorter alternative for "the kinds of mental operations that I care about."


I have a lot of sympathy for that view. It seems at least plausible that the only distinguishing features of what we think of as consciousness are also the distinguishing features of what we think of as emotion. I'd like to follow that path more at some stage, but most people I've said it to have dismissed it outright as seeming too implausible, and I don't really have much of an argument for it except Occam's razor: two mysteriously significant invisible processes in the same physical location are a priori less likely than one.
"These were my only good shoes."
"You ought to have put on an old pair, if you wished to go a-diving," said Professor Graham, who had not studied moral philosophy in vain.


Postby Brian Tomasik » Mon Dec 03, 2012 1:09 pm

Arepo wrote:It seems at least plausible that the only distinguishing features of what we think of as consciousness are also the distinguishing features of what we think of as emotion.

Cool.

The ethical definition of "consciousness" is akin to the functional definition of "tableness." The most practical question about consciousness for most of us is, "Do we care about what looks like suffering by this organism?" For tables, the practical question is, "Is this something I want to put my dinner plates on?" One could come up with alternate requirements for tableness -- e.g., that it must have at least four legs -- and those would make sense too, but they're less relevant to the practical question I'm answering. They may also not be totally irrelevant. Maybe I'm more inclined to put plates on things that have at least four legs.

Postby Arepo » Mon Dec 03, 2012 5:32 pm

Right, but it might be of interest if you could prove that a plate can sit on things iff they're four-legged wooden constructions. Similarly here (though I think my equivalence more likely :P).

User avatar
Darklight
Posts: 118
Joined: Wed Feb 13, 2013 9:13 pm
Location: Canada

Postby Darklight » Fri Feb 14, 2014 1:07 am

Sorry to necro an old thread, but I thought these interesting facts might be somewhat relevant.

According to my old undergrad Biopsychology textbook:

Lack of Clear Cortical Representation of Pain
The second paradox of pain is that it has no obvious cortical representation (Rainville, 2002). Painful stimuli often activate areas of cortex, but the areas of activation have varied greatly from study to study (Apkarian, 1995).

Painful stimuli usually elicit responses in SI and SII. However, removal of SI and SII in humans is not associated with any change in the threshold for pain. Indeed, hemispherectomized patients (those with one cerebral hemisphere removed) can still perceive pain from both sides of their bodies.

The cortical area that has been most frequently linked to the experience of pain is the anterior cingulate cortex (the cortex of the anterior cingulate gyrus; see figure 7.21). For example, using PET, Craig and colleagues (1996) demonstrated increases in anterior cingulate cortex activity when subjects placed a hand on painfully cold bars, painfully hot bars, or even on a series of alternating cool and warm bars, which produce an illusion of painful stimulation.

Evidence suggests that the anterior cingulate cortex is involved in the emotional reaction to pain rather than to the perception of pain itself (Panksepp, 2003; Price, 2000). For example, prefrontal lobotomy, which damages the anterior cingulate cortex and its connections, typically reduces the emotional reaction to pain without changing the threshold for pain.


According to my old undergrad Sensation & Perception textbook:

COGNITIVE ASPECTS OF PAIN

Pain is actually a subjective state with two distinguishable components: the sensation of the painful source, and the emotion that accompanies it (Melzack and Casey, 1968). The latter aspect of pain can be affected by social and cultural contexts and higher-level cognition. For example, reports of painful strains of the arm from tasks requiring repetitive motion spread rapidly in Australia during the 1980s--like a contagious disease--but they were communicated by workers who did nothing more than talk to one another about their experiences.

We have known for some time that areas S1 and S2 are responsible for the sensory aspects of pain, but researchers have recently been able to use new methods to identify the areas of the brain that correspond to the more cognitive aspects of painful experiences. In one study (Rainville et al., 1997) (Figure 12.11), participants were hypnotized and their hands were placed in lukewarm or very hot water (which activated thermal nociceptors). The participants were sometimes told that the unpleasantness from the water was increasing or decreasing, and their brains were imaged during these periods by positron emission tomography (PET). The primary sensory areas of the cortex, S1 and S2, were activated by the hot water, but the suggestion of greater unpleasantness did not increase their response relative to the suggestion of decreased unpleasantness. In contrast, another area, the anterior cingulate cortex (ACC), did respond differentially to the two hypnotic suggestions, by increasing or decreasing its activity according to the suggestion of increased or decreased unpleasantness. The researchers concluded that the ACC processes the raw sensory data from S1 and S2 in such a way as to produce an emotional response.

At the higher level still, pain can produce what Price (2000) has called "secondary pain affect." This is the emotional response associated with long-term suffering that occurs when painful events are imagined or remembered. For example, cancer patients who face a second round of chemotherapy may remember the first and dread what is forthcoming. This component of pain is associated with the prefrontal cortex, an area concerned with cognition and executive control.
"The most important human endeavor is the striving for morality in our actions. Our inner balance and even our existence depend on it. Only morality in our actions can give beauty and dignity to life." - Albert Einstein

Postby Brian Tomasik » Fri Feb 14, 2014 5:39 am

Thanks. :) Good background info. I hope you didn't spend too long typing that up. :P

The question on this thread is whether, if we shrunk those whole mechanisms to 1/2 their size, we would care 1/2 as much. It's clear that some parts of the brain are more important for pain than others, and reducing brain size by removing unrelated parts doesn't matter.

Postby Darklight » Fri Feb 14, 2014 10:29 pm

Brian Tomasik wrote:Thanks. :) Good background info. I hope you didn't spend too long typing that up. :P


You're welcome. Nah, it was pretty quick. I type fast.

Brian Tomasik wrote:The question on this thread is whether, if we shrunk those whole mechanisms to 1/2 their size, we would care 1/2 as much. It's clear that some parts of the brain are more important for pain than others, and reducing brain size by removing unrelated parts doesn't matter.


It seems like it wouldn't matter for our sensory experience of pain, but it may matter for our emotional experience of pain. So I think an important thing to clarify first is which of these two we actually care about. Someone may register pain signals, but might not "feel" like the experience was particularly unpleasant to them. I know from first-hand experience that certain psychiatric medication can do things like "zombify" a person, and make them have generally less affect. It's like experiencing everything with a veil of cotton around your head. Pain and pleasure both matter less to you, and the experience is almost like being less conscious than before. There are also other medications that can have the opposite effect, essentially making you feel more alert and aware of things, and able to process information better. This is actually partly why there's a black market for "study aid" stimulants. In my view, they don't just make you more focused, but actually seem to increase your IQ temporarily. It's also known that sleep deprivation lowers your IQ. Think about how you felt the last time you were really sleep deprived. How conscious would you rate yourself then compared to normal?

Anyways, if just the pure sensation of pain is what matters, then it follows that what we should be counting is the number of nociceptors that an organism has, or the ratio of nociceptors to other neurons.

But if what matters is the emotional experience of pain, then arguably we also have to count the neurons and connections that contribute to this distributed experience.

I'm just thinking of a simple example of how a seemingly unrelated area of the brain might have significant influence on emotional pain. Insects generally don't have a concept of their mother. Thus, they don't feel separation anxiety the way a human infant would. We would feel pain at the death of the loved one, mainly because we have the capacity to do a bunch of things, like represent close relationships and attachments to other beings, which itself requires an ability to model such beings, and then there are the faculties required to imagine what it would be like to be without them for the foreseeable future.

Similarly, I would think that animals that can pass the mirror test and are self-aware, like a pig, would suffer more emotional/psychological pain from feeling the threat of existential annihilation, than say, an insect.

Having more neurons and connections and more advanced structures that enable one to cognize at higher levels arguably allows the experience of emotional pain and emotional pleasure to be much richer, because of all these feedbacks and interactions and our imagination. Thus, the "quality" of the pain is greater, even if the quantity of physical pain is not. A human being is able to experience emotional pain in many more ways than an insect.

So I guess it's a matter of whether or not emotional/psychological pain stacks with sensory pain.

There's also the Gate Control Theory of pain (both my textbooks go into some detail about it but I'm feeling too lazy today to type it all up) which notes that there are ways for the brain to inhibit pain when it needs to do so. Examples of this include when soldiers have reported feeling no pain in the middle of battle despite severe wounds. So it's possible that the pain that we actually experience isn't just the nociceptors going "OW OW OW", but a complicated interplay of different emotional conditions.
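As a toy illustration of the gating idea (my own sketch, not from either textbook; the weights and parameter names are invented for the example), Gate Control Theory says non-painful touch input and descending signals from the brain subtract from the nociceptive drive before anything is consciously felt:

```python
def perceived_pain(nociceptive, touch=0.0, descending_inhibition=0.0):
    """Toy gate-control sketch (after Melzack & Wall, 1965).

    `nociceptive` is the small-fiber "OW" signal; `touch` is
    large-fiber input (e.g., rubbing the injury); and
    `descending_inhibition` stands in for top-down control from the
    brain (e.g., the soldier mid-battle). The 0.5 weight on touch is
    arbitrary, chosen only for illustration.
    """
    gate_output = nociceptive - 0.5 * touch - descending_inhibition
    return max(0.0, gate_output)
```

On this picture the same wound can hurt fully (`perceived_pain(1.0)`) or not at all (`perceived_pain(1.0, descending_inhibition=1.0)`), which is the soldier case above: the nociceptors still fire, but the gate is closed downstream.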

So to take a stab at answering the question "if we shrunk those whole mechanisms to 1/2 their size, we would care 1/2 as much", I think, assuming size means number of relevant neurons, the answer is yes, because it is the emotional state or experience of pain, its actual unpleasantness to us, that matters more than the pure sensory experience itself. I imagine this is why insects that lose limbs don't seem to show nearly as much stress as a mammal that loses a limb (not that I've tested this in any way). Both the insect and the mammal have neurons going "OW OW OW OW", but the mammal also has a complex of neurons going "OMG I'M GOING TO DIE!!! A PART OF ME IS MISSING!!! PANIC PANIC PANIC!!!" etc., while the insect is just going "OW OW OW OW OW, MOVE AWAY FROM BAD THING!!!"

So, it kinda depends on what you mean by "care". Do they care less in the sense that they are less conscious and intelligent and therefore can't cognitively comprehend the pain in as many ways as we can? Yes. Do they care less in the sense that they aren't as averse to the negative stimuli as us? Probably not. Given their much simpler world view, it's possible that physical sensations may actually matter more to them because they have little else to care about. So it all boils down to whether you think caring on a higher cognitive level matters or not.

It's also possible that insects, or more likely micro-organisms, are basically Braitenberg vehicles, which is to say that they aren't sufficiently advanced to reach the threshold of consciousness, and their experience would be more like the way we might experience sleepwalking.

Learning actually doesn't require conscious awareness. The ur-example here is H. M., the man who lost his ability to add things to long-term memory after an invasive surgical procedure. Even though he couldn't consciously recollect new things he learned, he was still somehow able to unconsciously learn certain motor skills. There's also the famous anecdote about how he repeatedly met a scientist "for the first time": he shook hands with him the first actual time, but because the scientist had an electric buzzer in his hand, in subsequent "first meetings" H. M. eventually started to refuse to shake hands, even though he couldn't explain why. XD

So, what does this all mean? It means that even if insects can feel pain and care about it at a very basic cognitive level, it's probable that their conscious experience is very different from ours, and thus their emotional "feeling" of pain may be difficult to compare to our own. We can't therefore assume that the experience will be the same but more or less intense. They don't have a cerebral cortex (though they have a pallium). They probably can't feel existential angst, or fears that are based on it. I don't think a bee stings an intruder only after a careful deliberation about its sacrifice for the greater good of the hive. It probably does so instinctively, without any consideration of the consequences to itself. At the same time, the fact that such insects are social suggests that they have a concept of "other bees who are my friends", which does require some degree of cognitive sophistication. So a bee may have a very simple concept of self (the part of the world it can control directly, versus other things), but it apparently doesn't reason much about that self, because bees generally are not very self-interested.

I guess where I'm going with this is that, while sensing pain doesn't scale with brain size, experiencing the negative emotions and cognitions associated with sensing pain does. There is probably a threshold of consciousness, but it appears to be very low. Nevertheless, there may be other thresholds that determine the quality or character of conscious experience. I think, for instance, that the gap between mammals and non-mammals is probably greater than many people realize in terms of their cognitive ability to comprehend their happiness and suffering. But at the same time, I do lean towards thinking that even very primitive animals like insects experience some very basic level of consciousness, and that probably has to be weighed in our considerations.

Postby Brian Tomasik » Sat Feb 15, 2014 1:11 pm

Once again, I'm back only briefly. Your writing output is impressive, and I don't have time at the moment to keep up with all of it, sadly. So as usual, the fact that I don't reply doesn't mean I wouldn't like to given more time.

I think you should consider getting a website or at least a blog. Your writings are too good to be lost in the depths of Felicifia threads. Let me know what I need to do to persuade you. :)

Also, if you're on Facebook, you are warmly encouraged to add me there.

Darklight wrote:Nah, it was pretty quick. I type fast.

That's good!

Darklight wrote:Someone may register pain signals, but might not "feel" like the experience was particularly unpleasant to them.

Pain asymbolia is often mentioned in this regard. I think most people agree it's the emotional reaction that matters rather than the quale itself.

Darklight wrote:Insects generally don't have a concept of their mother. Thus, they don't feel separation anxiety the way a human infant would.

As Robert Elwood says, fish don't have lungs, but they still breathe. Some insects are extremely social and may even develop behaviors that look like isolation syndromes when separated from the group.

Darklight wrote:We would feel pain at the death of the loved one, mainly because we have the capacity to do a bunch of things, like represent close relationships and attachments to other beings, which itself requires an ability to model such beings

You miss food without being able to model food. Pain upon separation from others could be implemented at a more basic level.

Darklight wrote:Similarly, I would think that animals that can pass the mirror test and are self-aware, like a pig, would suffer more emotional/psychological pain from feeling the threat of existential annihilation, than say, an insect.

I feel like threat of existential annihilation is probably trivial in most animals compared with living in the moment. Maybe this is an extension of my own peculiar psychology, which doesn't fear death.

Darklight wrote:A human being is able to experience emotional pain in many more ways than an insect.

On the flip side, a human may feel less pain at some things than an insect. For instance, when you're sick, you know you'll get better. An animal doesn't know that. When your loved one goes away, you can look forward to his/her return and therefore suffer less in the interim.

Darklight wrote:There's also the Gate Control Theory of pain (both my textbooks go into some detail about it but I'm feeling too lazy today to type it all up) which notes that there are ways for the brain to inhibit pain when it needs to do so. Examples of this include when soldiers have reported feeling no pain in the middle of battle despite severe wounds. So it's possible that the pain that we actually experience isn't just the nociceptors going "OW OW OW", but a complicated interplay of different emotional conditions.

Yeah. :) I mentioned Gate Control Theory briefly in the opening post, though I now think some of what I said there is confused. For my current views on this topic, see "Is Brain Size Morally Relevant?"

Wikipedia says that GCT is outdated, though I'm not sure what has replaced it.

Darklight wrote:I imagine this is why insects that lose limbs don't seem to show nearly as much stress as a mammal that loses a limb (not that I've tested this in any way). Both the insect and the mammal have neurons going "OW OW OW OW", but the mammal also has a complex of neurons going "OMG I'M GOING TO DIE!!! A PART OF ME IS MISSING!!! PANIC PANIC PANIC!!!" etc., while the insect is just going "OW OW OW OW OW, MOVE AWAY FROM BAD THING!!!"

It's plausible evolution hasn't built in the mechanisms for the insect to act upon the broken leg, i.e., avoid using it until it gets better, tend the wounds, etc. It's also possible evolution builds in less deterrence against injury for shorter-lived creatures.

How about a mouse vs. a human? A mouse reacts very badly to a broken leg too. Here it's not clear there's a big difference in behavioral response by the organism itself.

Darklight wrote:Do they care less in the sense that they are less conscious and intelligent and therefore can't cognitively comprehend the pain in as many ways as we can? Yes.

Do you think people with cognitive disabilities feel pain in fewer ways?

Darklight wrote:Do they care less in the sense that they aren't as averse to the negative stimuli as us? Probably not. Given their much simpler world view, it's possible that physical sensations may actually matter more to them because they have little else to care about.

Yes. :)

Darklight wrote:Learning actually doesn't require conscious awareness.

More on that here.

Getting concrete with numbers: How many insects would you let be shocked to death to prevent one human from being shocked to death? (I'm assuming that insects react about as aversively to shocking as humans do, relative to everything they care about.) How about mice vs. humans?

Postby Darklight » Sun Feb 16, 2014 8:43 pm

Brian Tomasik wrote:Once again, I'm back only briefly. Your writing output is impressive, and I don't have time at the moment to keep up with all of it, sadly. So as usual, the fact that I don't reply doesn't mean I wouldn't like to given more time.


No worries. My two skills in life are probably writing and programming, so I've gotten very good at turning my thoughts into words with a minimal amount of editing. I actually probably output a lot more than I should, which means I'm probably procrastinating on my thesis. >_>

Brian Tomasik wrote:I think you should consider getting a website or at least a blog. Your writings are too good to be lost in the depths of Felicifia threads. Let me know what I need to do to persuade you. :)


I actually do have a website, but it's mostly a personal portfolio rather than a proper site about my ideas. I used to have a few personal blogs, but I found that, in my less mature years, I would waste a lot of time posting emo stuff rather than meaningfully productive things, so I eventually shut down my blogs to cull that unfortunate habit. Also, I feel like there's already a million blogs out there, and I would just be adding yet another blog to people's already long reading lists.

One of the reasons why I am choosing to post on Felicifia is because I like the idea of a Utilitarian forum, and want to support it with my writing. Rather than just making yet another website or blog on the Internet, I feel that my time might be better spent supporting existing ones and trying to help sites like Felicifia reach a critical mass of intellectual activity that will produce synergies, for lack of a better word.

Brian Tomasik wrote:As Robert Elwood says, fish don't have lungs, but they still breathe. Some insects are extremely social and may even develop behaviors that look like isolation syndromes when separated from the group.


Indeed, this is why I discuss bees later on. I'm also aware that cockroaches appear to show distress at social isolation.

Brian Tomasik wrote:You miss food without being able to model food. Pain upon separation from others could be implemented at a more basic level.


Again, it's quite possible.

Brian Tomasik wrote:I feel like threat of existential annihilation is probably trivial in most animals compared with living in the moment. Maybe this is an extension of my own peculiar psychology, which doesn't fear death.


It's not -that- peculiar, given that Epicurus was apparently the same way (or at the very least he taught that the fear of death was silly).

Brian Tomasik wrote:On the flip side, a human may feel less pain at some things than an insect. For instance, when you're sick, you know you'll get better. An animal doesn't know that. When your loved one goes away, you can look forward to his/her return and therefore suffer less in the interim.


This does complicate things a bit. I admit, I'm making a big assumption that the added pain and pleasure from all the possible considerations that a human can have outweighs this somehow.

Brian Tomasik wrote:Wikipedia says that GCT is outdated, though I'm not sure what has replaced it.


I'm not aware of any new theory, but maybe my textbooks are out of date? I took those courses in 2007 though...

Brian Tomasik wrote:How about a mouse vs. a human? A mouse reacts very badly to a broken leg too. Here it's not clear there's a big difference in behavioral response by the organism itself.


A mouse is a mammal with a cerebral cortex, so it's not surprising that it would behave very similarly to humans. Most mammals are actually quite intelligent (stupid rats who fail to learn to press the lever for a food pellet in my Experimental Psychology: Learning labs notwithstanding XD). I would definitely say a mouse feels emotional pain more similarly to a human than an insect does.

Brian Tomasik wrote:Do you think people with cognitive disabilities feel pain in fewer ways?


It depends on the cognitive disability. If it's comprehensive rather than something specific like left-field object blindness, then they probably do feel pain in fewer ways, but the experience of pain itself may still be as intense as in non-impaired humans. There are some very unique disabilities, like congenital analgesia, where the person feels no sensory pain. I also think someone who's had a lobotomy probably doesn't feel as much pain either. Again, it really depends.

Brian Tomasik wrote:More on that here.


I find myself constantly being impressed by just how thoroughly you've researched these things. I apologize if I often go over things that you've already looked at. While I've read a lot of your essays, I'll admit I've more skimmed them for the gist, than given them all the kind of due diligence they deserve.

Brian Tomasik wrote:Getting concrete with numbers: How many insects would you let be shocked to death to prevent one human from being shocked to death? (I'm assuming that insects react about as aversively to shocking as humans do, relative to everything they care about.) How about mice vs. humans?


Right now, I'm not confident enough about my knowledge of the subject to really make such calculations with certainty. On the one hand, I did say I consider humans to be utility monsters, which would suggest the ratio could go arbitrarily high... but I hesitate to take that stand because it would potentially justify enormous amounts of suffering. So I guess I have to say, I don't know. It depends on whether a strict hierarchy of sentience can be justified, or whether the calculation should weight each sentient being according to their relative sentience level.

If it were the latter, we could assume that a human being is a million times as sentient as an insect. In that case the number could be a million insects to one human. And if a human being is a hundred times as sentient as a mouse, a hundred mice to one human. But again, I'm not even sure this can be justified, any more than weighing them equally can be justified. My intuition is that a human should be worth more than a mouse or an insect, but I admit that could be a bias.
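Purely to make the arithmetic concrete, the "weight each being by relative sentience" option could be sketched as below. The weights are the illustrative ratios above (a human counted as a hundred mice or a million insects), not numbers I'm defending:

```python
# Illustrative only: these ratios come from the guesses above
# (human = 100 mice = 1,000,000 insects), not from any measurement.
SENTIENCE_WEIGHT = {"insect": 1, "mouse": 10_000, "human": 1_000_000}

def weighted_suffering(counts):
    """Total sentience-weighted suffering for a tally of harmed
    individuals per species, e.g. {"human": 1, "insect": 500}."""
    return sum(SENTIENCE_WEIGHT[species] * n for species, n in counts.items())
```

Under this toy weighting, shocking a million insects comes out exactly equal to shocking one human, which is precisely the kind of conclusion that would need to be sanity-checked against intuition rather than read off the function.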

On the other hand, if we look at extreme edge cases, I really doubt we should give the same weight we give to humans, mice, or insects to a photo-diode, which according to Integrated Information Theory would be the most minimally conscious thing possible. A photo-diode doesn't feel pain or pleasure at its "experience" of discriminating between different levels of light. So I'm inclined to think that there are certain thresholds that need to be reached before we start granting "sentient" things moral worth. Thus, it may well matter more what structures are in the "brain" of these entities than the pure number of neurons. It's imaginable, for instance, to create an artificial neural network with billions of neurons that would rival a human brain in size, but with all those neurons used purely for image classification rather than for any sort of pain/pleasure evaluation. Indeed, Google recently made a massive network that just classifies cats in YouTube videos. It had more neurons than a snail, but arguably it was less sentient, because all those neurons served far fewer and simpler structures.

Thus, while the number of neurons is a good approximation in practice, that's only because evolved brain complexity in terms of structures seems to correlate well with the number of neurons.

