Hey Alex, thanks so much for putting this out there!! :)
OK, I will do my best to summarize my overall impression first, and then you can feel free to dig into whichever aspect you like. First, I appreciate the synthesis / bringing together of "aspects" of reality that you are offering. I believe that there is genuine value to be had there! The parts I found most accessible were *your* descriptions of Langan's theory. While I had heard about it (a while ago), I had not engaged with it deeply. I suspect that *something* like this might be what I consider to be the "basis of experience" (the fact that "life feels like anything at all").
My own philosophical background includes having twice listened to John Searle's full lecture series on the "Philosophy of Mind." The "overlap" I see is in Searle's distinction between *real* thinking and "simulated" thinking, which is his way of saying that living beings do (way) more than "computing information." Experience, for him, is more like digestion. If you imagine that you either *actually* digest a meal or merely *simulate digesting* a meal, you will (maybe) immediately spot a "difference": only *real* digestion actually "changes" reality. Simulation is, at best, an attempt at figuring out some pattern, but not actually "living" it.
That all being said, I find myself quite skeptical these days whenever I am asked to consider that "information alignment" *alone* (AI-based or otherwise) can/would be the solution. The reason I am skeptical is that I have by now been in many groups in which the *informational alignment* (agreement on many important "thoughts," including value statements, preferences, who we would vote for, etc.) seems to converge on "perfect." And yet, when the time comes to "work together" (i.e., to actually "move in the same direction"), it all easily falls apart.
My hunch is that for any number of people to be able to move together, an alignment is needed that cannot be achieved through "information alignment" alone (including "value statements"). Instead, I assume it needs something "embodied." Something where people feel *immediate, physically relevant stakes* in the shared reality of "working together." In the TV show (and books) "Game of Thrones," this was done by families sending some of their children to be wards at the seats of other great houses. This more or less ensured that if one family started to misbehave, the other could easily take revenge by killing their children (in their respective homes). That's probably too barbaric and drastic... But I feel that some kind of "investment" in a shared reality (and future) is maybe necessary for people to feel, well, "invested." Interestingly, I only recently looked up the etymology of the word "investment," and it made perfect sense to me... By offering something of value, you provide a "container" (a vest, as it were) for a literal "object" outside of yourself (in a romantic situation, that object is the relationship itself; in a financial situation, it could be a company, an entity in which several people "invest" time and money). The point is that there is *the mutually assured potential for sacrifice*. And I don't know how AI alignment would achieve that...
The part that I found least compelling was the one on "information making it between lives" (afterlife, etc.). At least in my own experience, I have not yet encountered anything that even remotely made me think this is real, and I also do not consider it necessary for experiencing meaning in my life -- but that might simply be a personal limitation of mine ;) All I can say is that to the extent that you (personally) feel this is *relevant* to the overall topic, I was somehow missing HOW it is relevant...?
The part on plant spirits was definitely something I have been thinking about for some time. Maybe three or four years ago, I at some point had a strong sensation of "being" an amoeba (or at least something more akin to one than I usually feel I am). The experience stayed with me, and if I bring it to the forefront of my "mental model" of who or what I am in (or to) the world, I *can* still feel that way. The feeling is quite interesting. Everything that my language and higher cognitive functions offer me fades into the background, and the world reduces itself to something MUCH simpler. There is only something like an energy gradient that seems to push down on me, and I am compelled (by my nature) to work against it. Since I have finite internal resources, I also need to be "quick" (or maybe "efficient" is the better word) in ensuring that I (soon enough) find a new source of energy, and that I remain safe enough until I do. The first time I landed on this experience, it was quite uncomfortable, because I also felt THAT THIS IS *TRUE*. And because it is true, I had to conclude that I am far, far less sophisticated than I initially "wanted" to be. All that splendid "thinking" I can do is actually just there to help me be a better amoeba. But ultimately, I am not better than (or even particularly different from) an *actual* amoeba. Just more sophisticated in a technical sense (but not a *moral* one!). That is also why we can be so easily "overtaken" in an evolutionary race (by even the smallest of life forms, like Ebola)... Whatever "gains" we may perceive humans as having made over the millions of years, the advantage is only "relative."
Instead, what I have come to believe since then is that *what matters is genuine experience* -- in the sense that IF I can have experiences that are truly *novel* to the universe, then I have *added* to its overall "value." And novel not in the shallow sense of tasting one human (say, sexually) after another, each being "novel," but rather a *truly novel pattern* of experience, some KIND of experience that has not yet been had.
In that sense, humans ARE different from other creatures, because our thoughts *DO* make up a large part of our experience, and things like literature and the beauty that can come from poetry (let alone music and the arts) are, indeed, DEEPER than I imagine animal or plant experiences of "beauty" to be. Which brings me, maybe, to my last point...
This is, probably, something few people would agree with me on: I *imagine* that if I can see *beauty* in whatever exists *as it exists right now* (including, say, the war in Ukraine or the Middle East), THEN (and maybe only then) can I truly live as I am meant to live. Just as every tree and animal in the world can "co-exist with the war" without going crazy (without having mental experiences of "ought" in the face of the wars, experiences of "this is wrong!"), if I can find *some* beauty in it, then I can *transform that bit of beauty into my response to it*. That does not mean I have to like the war "as is," wholesale, but that I am also no longer trying to "get rid of it." Instead, I can focus on the *seed of opportunity* that lies within whatever situation I am facing, and take it from there.
OK, I realize I probably didn't all that much "answer" the question of "what about Alex's podcast," hahaha. Instead, this is probably more my own philosophy -- but it may help you see where I find it difficult to bring some of the things you say into alignment with it ;)