Once again, I'm back only briefly. Your writing output is impressive, and I don't have time at the moment to keep up with all of it, sadly. So as usual, the fact that I don't reply doesn't mean I wouldn't like to given more time.
No worries. My two skills in life are probably writing and programming, so I've gotten very good at turning my thoughts into words with minimal editing. I probably write a lot more than I should, though, which means I'm probably procrastinating on my thesis. >_>
I think you should consider getting a website or at least a blog. Your writings are too good to be lost in the depths of Felicifia threads. Let me know what I need to do to persuade you.
I actually do have a website, but it's mostly a personal portfolio rather than a proper site about my ideas. I used to have a few personal blogs, but I found that, in my less mature years, I would waste a lot of time posting emo stuff rather than meaningfully productive things, so I eventually shut down my blogs to cull that unfortunate habit. Also, I feel like there are already a million blogs out there, and I would just be adding yet another one to people's already long reading lists.
One of the reasons I'm choosing to post on Felicifia is that I like the idea of a Utilitarian forum and want to support it with my writing. Rather than just adding yet another website or blog to the Internet, I feel my time might be better spent supporting existing ones and helping sites like Felicifia reach a critical mass of intellectual activity that will produce synergies, for lack of a better word.
As Robert Elwood says, fish don't have lungs, but they still breathe. Some insects are extremely social and may even develop behaviors that look like isolation syndromes when separated from the group.
Indeed, this is why I discuss bees later on. I'm also aware that cockroaches apparently show distress at social isolation.
You can miss food without being able to model food. Pain upon separation from others could be implemented at a more basic level.
Again, it's quite possible.
I feel like the threat of existential annihilation is probably trivial in most animals compared with living in the moment. Maybe this is an extension of my own peculiar psychology, which doesn't fear death.
It's not -that- peculiar, given that Epicurus was apparently the same way (or at the very least he taught that the fear of death was silly).
On the flip side, a human may feel less pain at some things than an insect. For instance, when you're sick, you know you'll get better. An animal doesn't know that. When your loved one goes away, you can look forward to his/her return and therefore suffer less in the interim.
This does complicate things a bit. I admit I'm making a big assumption: that the added pain and pleasure from all the extra considerations a human can have somehow outweighs this.
Wikipedia says that GCT is outdated, though I'm not sure what has replaced it.
I'm not aware of any new theory, but maybe my textbooks are out of date? I took those courses in 2007 though...
How about a mouse vs. a human? A mouse reacts very badly to a broken leg too. Here it's not clear there's a big difference in behavioral response by the organism itself.
A mouse is a mammal with a cerebral cortex, so it's not surprising that it would behave very similarly to a human. Most mammals are actually quite intelligent (the stupid rats who failed to learn to press the lever for a food pellet in my Experimental Psychology: Learning labs notwithstanding XD). I would say a mouse probably feels emotional pain more similarly to a human than an insect does.
Do you think people with cognitive disabilities feel pain in fewer ways?
It depends on the cognitive disability. If it's comprehensive rather than something specific like left-field object blindness, then they probably do feel pain in fewer ways, but the experience of pain itself may still be as intense as in non-impaired humans. There are some very unusual conditions, like Congenital Analgesia, where the person feels no sensory pain at all. I also think someone who's had a lobotomy probably doesn't feel as much pain. Again, it really depends.
More on that here.
I find myself constantly impressed by just how thoroughly you've researched these things. I apologize if I often go over things you've already looked at. While I've read a lot of your essays, I'll admit I've skimmed them for the gist more than given them the due diligence they deserve.
Getting concrete with numbers: How many insects would you let be shocked to death to prevent one human from being shocked to death? (I'm assuming that insects react about as aversively to shocking as humans do, relative to everything they care about.) How about mice vs. humans?
Right now, I'm not confident enough about my knowledge of the subject to make such calculations with certainty. On the one hand, I did say I consider humans to be utility monsters, which would suggest the ratio could grow arbitrarily large... but I hesitate to take that stand because it would potentially justify enormous amounts of suffering. So I guess I have to say, I don't know. It depends on whether a strict hierarchy of sentience can be justified, or whether the calculation should weigh each sentient being according to its relative level of sentience.
If it were the latter, we could suppose that a human being is a million times as sentient as an insect; in that case the number would be a million insects to one human. And if a human being is a hundred times as sentient as a mouse, a hundred mice to one human. But again, I'm not sure this can be justified any more than weighing them equally can. My intuition is that a human should be worth more than a mouse or an insect, but I admit that could be a bias.
On the other hand, if we look at extreme edge cases, I really doubt we should give a photodiode the same weight we give to humans, mice, or insects, even though, according to Integrated Information Theory, a photodiode would be the most minimally conscious thing possible. A photodiode doesn't feel pain or pleasure at its "experience" of discriminating between different levels of light. So I'm inclined to think that certain thresholds need to be reached before we grant "sentient" things moral worth. Thus, it may well matter more what structures are in the "brain" of these entities than the raw number of neurons. It's imaginable, for instance, to create an artificial neural network with billions of neurons, rivaling a human brain in size, with all of those neurons devoted purely to image classification rather than to any sort of pain/pleasure evaluation. Indeed, Google recently built a massive network that just classifies cats in YouTube videos. It had more neurons than a snail, but arguably it was less sentient, because all those neurons formed far fewer and simpler structures.
Thus, while the number of neurons is a good approximation in practice, that's only because the structural complexity of evolved brains seems to correlate well with their neuron count.
"The most important human endeavor is the striving for morality in our actions. Our inner balance and even our existence depend on it. Only morality in our actions can give beauty and dignity to life." - Albert Einstein