
In this Wednesday Night Live on October 15, 2025, Stefan Molyneux engages in a dynamic conversation about artificial intelligence, human value, and societal trends. Kicking off the discussion, Stefan welcomes viewers to the live call-in show, expressing his hope that everyone is having a wonderful week.
Stefan begins by sharing his thoughts on artificial intelligence, emphasizing that the intersection of AI and robotics holds transformative potential. He cites examples of AI successfully taking orders in restaurants and predicts that the fusion of AI and robotics will reshape industries, particularly highlighting how autonomous vehicles could affect jobs like trucking. He ponders whether future generations, accustomed to interacting with technology, will find it acceptable to be served by machines, contrasting that with the preferences of older generations who cherish human interactions in settings like banks and grocery stores.
As the conversation evolves, Stefan raises concerns about the displacement of workers due to automation, suggesting that as AI technologies advance, many people may become economically redundant. He reflects on historical precedents, such as the enclosure movement before the Industrial Revolution, to illustrate how society has adapted to significant economic shifts in the past but questions whether similar adjustments can be made in the wake of AI. His perspective leads to a broader examination of what happens to a society where a significant portion of the population cannot find productive work.
Various callers add to this rich dialogue. One caller challenges Stefan's views on moral behavior, leading to a discussion about free will, responsibility, and the complexities of individual choices in today's world. Another caller delves into the implications of societal transformations brought on by AI and expresses concern over the potential obsolescence of human labor in a landscape increasingly dominated by machines capable of performing tasks better and more efficiently than humans.
Stefan also tackles the implications of voting and economic systems, reflecting on how the attitudes of the lower class towards wealth redistribution can affect societal dynamics. He engages with callers about their thoughts on the morality of wealth, societal compassion, and the consequences of conditioning individuals to rely on government assistance versus empowering communities to support one another.
The philosophical inquiry deepens as Stefan discusses the moral landscape and societal narratives influenced by our responses to new technologies. He acknowledges the complex relationship between creativity, intelligence, and the evolving nature of work as we integrate more technology into our daily lives. The episode touches on the ethical considerations surrounding automation and the collective responsibility to ensure that no group is left behind as society transitions.
Towards the end of the episode, Stefan poses challenging questions about the human condition, morality, and the role of society in creating value and addressing the needs of its citizens. He urges listeners to critically reflect on their beliefs and values, emphasizing the importance of nurturing virtue and intellectual rigor as we navigate the ever-changing landscape of human progress.
In conclusion, this engaging and thought-provoking episode of Philosophy invites listeners to reflect on the intersections of technology, morality, and economic value in the modern world. With a spectrum of perspectives shared by the callers, the conversation reveals the depth of Stefan's commitment to encouraging critical thought and meaningful dialogue on pressing societal issues.
0:08 - Welcome to Wednesday Night Live
19:01 - The Complex Nature of Wealth and Poverty
21:02 - Understanding Noblesse Oblige
34:47 - The Burden of Unearned Gifts
39:00 - The Consequences of Compassion Fatigue
39:52 - A Call for Personal Responsibility
55:27 - The Moral Dilemma of Wealth Redistribution
1:10:35 - Forcing Morality on Others
1:13:26 - Impediments to Charity
1:19:25 - The Impact of Technology
1:23:30 - The Tsunami Warning
1:26:17 - Compassion Fatigue
1:33:53 - AI's Creative Limitations
1:41:16 - The Nature of Existence
1:53:29 - The Challenge of Morality
2:01:03 - Being Our Own Skeptics
[0:00] Good evening, good evening. Welcome, everybody. Half pinch punch, half mid-month, 15th of October 2025.
[0:08] Oh, gosh, another base five day, and I hope you're doing well. I hope you're having a wonderful week. Thank you so much for joining us here on Wednesday Night Live for Philosophy, and I'm happy to take your calls, questions, issues, and challenges. If we could get some new blood in, I'd be thrilled about that too. I see you there, Diego, happy to chat. And, you know, I was talking to a friend of mine the other day, and he was asking me my thoughts about AI. Now, of course, I've done a couple of presentations on AI, and I have some technical knowledge and understanding of it, and its potential, and so on.
[0:51] And I'm going to share a couple of thoughts with you about AI. And of course, I'm happy to get your thoughts as well. I'm checking here in the chat, get your thoughts as well. So to me, it's not just AI, it's AI plus robotics. That is going to be the big thing. So a friend of mine's daughter answers the phone at a restaurant. And of course, the question is, can AI answer the phone in a restaurant? Yes, it can. It can take the orders. It can ask questions. It can speak in any one of a hundred different languages seamlessly. It can recognize accents and transpose them to something a computer can understand. And it can do all kinds of funky, wild, weird, and wonderful stuff. And of course, it can quote the price, estimate the pickup time based upon what's going on in the queue. It can do some pretty wild stuff. Can it do it better than my friend's daughter? Well, certainly in terms of language abilities, yes.
[1:58] What about trucks, right? As you know, there are a million people in America who are truckers. And what if, particularly, you know, with immigration, it's interesting, right? Because one of the things that happens with immigration is you get people who haven't grown up driving in the winter. I mean, if there's ice and black ice and slippery, you know, slip-sliding Paul Simon stuff away, then you get people who haven't received that instruction, who didn't learn how to drive in icy conditions. You know, driving in winter is slow motion. Everything's slow and all of that. And AI can adjust to and adapt to all of that. Elon Musk is talking now about how
[2:48] You can push a button and get your Tesla to come and pick you up. You've got a bunch of packages at the mall, a bunch of grocery bags; you can, you know, push a button, it'll start itself up, drive from the parking spot over to you, open up the back, and you can load up. It's wild stuff. So, AI plus robotics. Have you seen these robots that can pick fruit? You've got robots that can pick strawberries, robots that can pick raspberries, and robots that can pick grapes more gently than the average person, because grapes are tough, right? You've got to get in and feel there and see it. And they can do all of that. And there is a rather alarming implication that's going on at the moment, which is: what happens with AI plus robotics? Now, robotics isn't just, you know, Terminator arms and being able to do skateboarding stuff and all of that. Robotics, to me, is any interface between the CPU and physical equipment. So I would view an AI truck as kind of like a robot, in that it's AI combined with physical equipment to make a person redundant. My wife and I were talking today at lunch, and she's like, I would never want to be served by an AI robot.
[4:05] Or, because we were chatting with, it was actually the owner of the restaurant we were chatting with, and having a great time chatting about life and restaurants and all of that. And she's like, well, you couldn't do that with AI. And she says, you know, I like going to the bank and being able to chat with people I know. And I like going to the grocery store and chatting with people I know and all of that. And I said, of course, as a mature husband, I'm like, oh, so you don't enjoy chatting with me? Anyway, it's delightful how attractive that is to people. Oh, let me stay in my bubble, please. It's a big fleshy dome bubble of spottiness and pink thoughts. But... I don't know. The next generation grows up interfacing with phones and tablets and computers, and would they mind? Would they be bothered? Would they dislike
[4:56] Being served by robots? I don't know. I mean, to me, if it took 20% off the price of the meal, why not? I mean, there's a McDonald's, I think it's in Japan, where everything's automated, even the cooking. So there is a very interesting problem, and I'm going to do more of a presentation on this. And again, I'd love to get your thoughts on all of this as a whole. But I think that there is a very real and interesting phenomenon by which 80% of people might be rendered economically non-valuable.
[5:42] Because in the past, there was always some place for people to go if they were made, quote, redundant, right? So one of the things that happened in the 18th century, and you should read my novel Just Poor about this, because I'm always fascinated by what comes before the big historical thing that paves the way towards it, because that's where the real gold is intellectually. And so, you know, I wrote a novel about what happened a generation before the revolution in Russia. It's called Revolutions, and you can get it at freedomain.com/books. And then I also wrote a novel about what happened before the Industrial Revolution.
[6:20] Before the Industrial Revolution was the enclosure movement, where basically land was consolidated because it had all gotten fragmented because of inheritances and primogeniture and so on. And so land was rationalized, it was centralized, it became part of the free market, and the people who were the best at growing crops got the most land. And through that process, you ended up with a 10, 15, 20 times increase in agricultural production, but you just needed fewer people as a result to be on the land. So you had this kind of displaced or dispossessed group of people who used to work the land and were serfs, and they flooded to the cities, to the towns, the urban centers. And that mass of ready labor was, in many ways, the germinated seed of the Industrial Revolution. You can't have an Industrial Revolution if everybody is required to produce food
[7:18] For the country, right? I mean, even as recently as 1900, 70, 80% of Americans were involved in farming. Now it's two or three percent, right? So you need an excess pool of labor for the Industrial Revolution to occur. Basically, you need the end of slavery and serfdom for the Industrial Revolution to occur, but you need that. And so in the past, when you had massive increases in efficiencies in particular economic areas... I mean, it's happened in war, because it used to be that you just needed massive amounts of people to throw into the, you know, whirling death blades of eternal combat, right? You just needed the war of attrition. The more population you had and the more money you could print, the longer your war could go on. And it came down to, you know, if you're killing 1.1 of them for every one that they kill of yours, then you just win over time, but it takes a long time. It's kind of what I talked about the other day, well, yesterday, with the Russia-Ukraine conflict.
[8:19] So, you know, 10 million people in World War I, 40, 50 million people in World War II. Of course, in World War II, a lot of non-combatants died, were killed. But in World War I, it was mostly combatants and sort of trench warfare. And even now, with drones, with airstrikes, air superiority, and space lasers and things like that, you don't need as many people to win a war, because it's technologically based. So even the physical requirements for war have diminished. So, in the past, when you had a particular sector of the economy that was closing down in terms of its soaking up of excess labor, there were places for people to go. If you couldn't get into the army, you could go work on the farm. If the farm required fewer people, you could go to the city and get a job in a factory.
[9:21] But with AI, that's the big question. Is that the end of the road for people who can't or won't produce a lot of value? Now, I mean, there are two groups of people, right, who don't produce a lot of value: those we can have a lot of sympathy with, and those I think it's unfair to have a lot of sympathy with. So there are people who are just not very smart. And, you know, maybe you can budge a certain amount in IQ, but it seems to be kind of a fixed thing. It's not a matter of more nutrition: there were Dutch children who were virtually starved during World War II, and they grew up to have perfectly normal IQs relative to the rest of the population. So it's not a matter of nutrition. I do think it's a matter of good education, but good education can only take you so far. Singing lessons will make just about everyone a better singer, but they won't turn people into really good singers.
[10:26] So some people can't produce much economic value because they have an IQ of 80 or 85 or 90. At 85 to 90, you can produce economic value, but you're kind of easily replaceable by automation, right? So, in the past, when giant sectors of the economy closed down, there were other sectors that were opening up. So, if you're just not that smart, we can have a lot of sympathy. It's not your fault. Like being short, it's not your fault. And we should have a lot of sympathy, a lot of compassion, for people who, through no fault of their own, just aren't that smart.
[11:05] There is, of course, the other group of people. If you've ever seen it, there's a pretty good Jack Nicholson movie called Five Easy Pieces, where he plays a guy who's totally brilliant but is kind of pissing his life away for reasons of passive aggression and so on. This is the famous scene where he wants some toast, and the waitress is like, I can't make you toast. And he's like, just make me a chicken salad and hold the chicken, right? And hold the salad. And it's been, I don't know, probably 40 years since I saw the movie, but I do remember that he was really great at piano and really brilliant and had all of the opportunities in the world, but then he chose to just sort of work on these roughneck oil rigs and kind of do nothing with his life. Now, of course, he was still producing economic value, but there are the people who, you know, self-medicate bad childhoods with drugs rather than with self-knowledge, which is not self-medication but releasing yourself from the pain.
[12:01] And there are people who have high potential but who piss it away. Which, you know, if you want a little lifting of the lid into the old StefBOT thing: I've always had a, should we just talk about our fears? Why not? Why not talk about our fears? We could be honest; I'd love to hear your fears too. But I've always had a genuine terror of getting to the end of my life, looking back and saying, what did I do? What did I achieve? Did I just watch a lot of TV and play a lot of video games and consume a lot of other people's creativity? What did I do? What did I create? What did I? It's like when I'm writing novels, there's always this battle between the sort of wild, fiery language that erupts in me to describe things and trying to wrestle that into writing books that do some good in the world, that promote virtue. When I was writing Just Poor, and Just Poor is a work of really scathing criticism of socialism. And...
[12:59] The story and the characters want to pull me in one direction, but the sort of iron will of wanting to turn my artistic talents to good means we've still got to talk about socialism. Oh, but we want to be authentic to the characters, and blah, blah, blah. It's like, yeah, but the only reason the book exists is to critique socialism. So I've got this sort of iron will of promoting virtue at the same time as this fiery language center that wants to just come up with the wildest and most beautiful language known to man, and trying to wrestle the two together to forge something that is both artistically powerful and does good in the world. It's a sort of big challenge. So I've always had this sort of fear that I was just going to piss away my talents, piss away my abilities, be frustrated by being a sort of pro-free-market, pro-objective-morality artist, facing the constant attacks and rejections from the people who have power in art, the publishers and movie makers and so on. And I don't know how people piss away their talents, but they do. It happens all the time. I grew up with some people who had brilliance, absolute brilliance, and did nothing with it.
[14:18] God, that is wild. It's like inheriting $10 million and living life as a complete miser. You know, Howard Hughes, giant Kleenex boxes on your feet and fingernails that go on for three days. So those people, you know, maybe they can adapt. Maybe they can figure things out. The people who aren't so smart, I'm torn. Honestly, I'm at a fork in the road. I'm at a fork in the road. Part of me is like, oh, sympathy, and it's not their fault, and blah, blah, blah. And part of me is like, eh. Eh.
[14:56] I mean, the poor and the less intelligent, I always try not to underestimate them, and I certainly don't want to infantilize them. And I don't want to pretend that they're not cunning, because, I get it, what you lack in intelligence, you often make up for in a kind of cunning, just kind of a chilling thing. So they know how to game or work a system. You know, like: I'm not that smart, but I sure know how to fake an injury in order to get my disability pay. I'm not that smart, but I know if I put my kid on psychotropics, I'm going to get more money in social security benefits. I'm not that smart, but... it goes back to many, many years ago. I was in a Northern Ontario town, and I was at a playground playing with my daughter. She was very little. And there were these two women. Not smart, you could see, not smart. But they were trading tips on how to game the system. Well, if you apply for this, and then you combine it with that, you can get an extra $300 a month. And over here, there's this system, you apply it with that, you get free groceries, and here's how to get the free dental. And if you need braces, you can apply here. Like, they're just trading stock tips like Nancy Pelosi's hand puppets.
[16:16] I'm like, not that smart, but cunning, really cunning. Know how to maximize advantage, know how to game and work systems.
[16:31] So, part of me is like sympathy, and this is where the UBI thing is going to come from. There's no work for people. There's no work for people. You know, the idea that we're just going to import massive amounts of human bodies because we need laborers... right, somebody posted on X about the massive reductions in, say, the Scottish population under the Black Death. The Black Death, while an absolute disaster for Europe, did bring about the agricultural revolution and did bring about the industrial revolution and the modern world, because the Black Death reduced population by a third, a half, or sometimes even more, and particularly among the poor and the laborers. And therefore the remaining laborers had a lot more leverage, and they could negotiate for better working conditions; pay in many areas in Europe went up three times. And they could negotiate for an end to serfdom and being able to buy and sell their land, rather than just being livestock tied to the purchase and sale of land by the lord. And so, while it was a catastrophe, it also gave birth in many ways to the modern world.
[17:44] And so, it's one of these things where if leaders were intelligent, wise, or vaguely moral... as Plato said, the world will never know peace until the kings become philosophers or the philosophers become kings, which is, again, not likely to happen anytime soon. But the idea that you're going to bring in a large number of laborers at a time when AI is about to render a lot of people's labor unnecessary, I mean, it seems to me far from an accidental recipe for social disaster. It's kind of obvious. The elites have access to much more information than you and I, so they know a lot more about what's coming down the pipe.
[18:29] So part of me is like, um, it's not their fault, be sympathetic, be kind, be generous and all of that. And part of me is like, you know, if you've made any kind of coin over the course of your life, you know, I mean, you just know statistically, right? That most of the tax payments are made by the top few percent of the population, and the bottom half really don't pay any taxes at all.
[19:01] And so the sort of cunning poorer people are very, very happy to vote away the labor and wealth and savings of the wealthy. Very happy to do that.
[19:21] I mean, I try not to position myself in any particular class, you know, lower class, middle class, but I try to sort of look at things objectively. And the wages of sin is suffering. The wages of sin, what you get paid for sin, is suffering. And I don't know, you know, I wrote about this in my novel, The Present, which you should definitely check out. It's free at freedomain.com/books. But when the rich and the powerful look at the cunning and greedy brutality and soulless, greedy avarice of the poor, to get everything for nothing and to treat the wealthy and those with savings as livestock to be captured, chained, and exploited, often in order to pay for bad decisions made by the poor, it's kind of hard not to see the rich and the poor as kind of in a state of nature. That wouldn't be the case in a free society, but we're sort of talking about the state of society. What compassion is left for the poor among the rich, given that the poor have voted to take away the property of the rich for 90 years?
[20:42] 70 years. No, I would say 90. Mid-30s. Mid-30s was when it began. Well, prior to the Great Society, it was FDR, a chicken in every pot, the New Deal, and then there was the Great Society under LBJ.
[21:03] And so, starting in the 30s, peaking in the 60s, and continuing in a slower but significant upward trend ever since, has been a massive pillaging of those who've been smart, worked hard, saved money, and produced a lot of value: the pillaging by the poor and the lower middle class.
[21:26] What does that do to the wealthy when they are asked to have a lot of compassion for the poor? I don't know. It's not like I talk to wealthy people on a regular basis, but I would imagine, sort of putting myself in that elevated perspective, I would say that what it does is, it's like, F me? F you. Exploit me and my kin for three generations? I don't care anymore what happens to you. The fact that you've just run to every politician who's promised to steal from us and give to you the blood, sweat, labor, toil, and inheritance of our 60-hour, 70-hour workweeks; the fact that you've gone to every politician who said, make the rich pay their fair share. We pay most of the taxes, these rich people would say. So, given that every politician
that promises to take from us and give to you, you have voted for, you have cheered on. What compassion is left? Now, there's a great price to be paid for exploiting others. A great price to be paid for exploiting others. One of the things that happens when you exploit others is you have to dehumanize them. Very foundationally, very fundamentally, you have to dehumanize them, which is why the rich are caricatured, right? Fat cats and monocles and bulging waistlines and the lie about capitalists banging every secretary and worker in the factory and so on. I mean, it happened, but it certainly wasn't common. You may be thinking of the entertainment industry in the 21st century, not the capitalists of the 19th century, who were Christian. So, what compassion is left from the wealthy to the poor? If you exploit people, you also don't grow up. Growing up is when you take responsibility for your own decisions. Right, as you've heard me say, I had my life's work torn to shreds a little over five years ago.
[23:42] Had 15 years of work erased. Have I ever blamed anyone but myself? No, I take responsibility for the choices that I make. But when you exploit other people and you demand that they pay for your mistakes, you just never grow up. You remain a whiny, belligerent, manipulative child in adult form, and you are easily programmed to resent other people who do better, who make better choices, or who are smarter.
[24:21] Dehumanization, immaturity, entitlement, right? In order to exploit others, you have to feel entitled to the fruits of their labor. You have to enslave them. Ironically, you have to enslave others. And in particular, you have to enslave the next generation through a combination of deficits, debt, and unfunded liabilities. I mean, the boomers claim to feel so guilty about slavery. You know, one way I would believe that the boomers feel guilty about enslavement is if they hadn't enslaved everyone who came after them with all of this excessive spending. They don't care about slavery at all. Or, like, I guess a lot of people, they care about the slavery that happened 150 years ago, not the slavery they're inflicting on everyone else in the future.
[25:12] So, I don't know how much compassion there is among the wealthy for, not the poor because they're poor, but for the poor because they are avaricious, greedy, dehumanizing, exploitive, and ridiculously immature. You know, when I was younger, and I'll take your calls in a second, I appreciate your patience here. So when I was younger, I always used to, when I was in a social situation or on a date, meeting new people or whatever, I would always look, and in particular, this is true for dating, I would always look for, does this person take responsibility? Responsibility. Does this person take responsibility? You know, there's this old meme that like only a few percent of men are narcissists, but every woman claims to have dated narcissists. Maybe not. Maybe not. Maybe not. Maybe not.
[26:24] So I would always, um, you know, I'd look for a woman who would say, yeah, I dated the wrong guy, and here's why; I did therapy and I figured everything out, and I'm going to make better choices. And it was my responsibility. That's what I wanted to hear. Like the woman I met who I took out for a coffee many, many moons ago, decades ago. And she said, oh yeah, my boyfriend ran up $17,000 in debt on my credit card and then took off. I said, my gosh, were there any signs? No, he was totally normal. I'm like, okay, no, no, he wasn't. No, he wasn't. There's no perfect camouflage for crazy. It's always got signs. It's always obvious. But rather than saying, I was taken in by his height and dark looks and charm or whatever. Or, you know, I've talked about this with men too, countless times on the show. My girlfriend was crazy. Were there any signs? No. And then you start looking for the signs, and there are signs everywhere. Big signs, big signs. People who take responsibility. I just look for people who take responsibility. You can't have relationships with people who don't take responsibility, because everything that goes wrong, their fragile egos can't handle the fact that they might have made a mistake. They need to be perfect or they're enraged, and nobody's perfect, which means you're in a constant state of imminent IED.
[27:52] Improvised emotional destruction. So you can't have relationships with people who don't take responsibility. And this is a little bit more true among women in terms of dating. A woman who has a kid by the wrong guy: hey, you made a mistake. You made a terrible mistake. They often can't process it. They can't process it. And it just slides like water off a duck's back. They're just opposing giant magnets. You can't get them together. And you can't have relationships with people who don't take responsibility, because there is no person there. There is only excuses, avoidance, projection, and exploitation.
[28:38] All right, let's get your comments. I'll take a call. Chris says, I think you're probably correct. The majority of the wealthy understand well that it takes one's time to accrue money, and money is property, an extension of their person. So when the poor vote to steal more money from the wealthy, it could be seen as stealing part of their lives, an underlying mortal battle. Yeah, it's enslavement. It's enslavement. Absolutely. We talked about this the other day: when somebody makes 40 bucks an hour and has a $40,000 car, and somebody steals their car, they've just stolen all that time from them. Just stolen all of that time from them. It's really terrible. If I make 10 bucks an hour and you steal $10 worth of value from me, you've just enslaved me for an hour of my life. That's no good. But you don't get to kidnap some woman and take her on a date for an hour of her life and then release her back where you got her and say, hey, no harm, no foul. And whether you kidnap someone proactively or retroactively is not particularly material. And so the poor have been lured in by the inevitable Marxists. The poor have been lured in to say that the wealthy are only wealthy because they stole from us.
[30:02] And then the welfare state and things like that are sort of emotionally perceived as self-defense or blowback for the rich stealing from everyone. But all it does is erode any reciprocal goodwill from the wealthy. I mean, I think that the wealthy fear the poor because the poor outnumber them. But I think, unfortunately, because of the poor voting to take away the property of the rich and thus making everyone poor... tax the rich, wasn't that on the back of AOC's dress some years ago? So the poor are constantly voting in people who want to take from the rich and give to the poor. And that has destroyed, in the modern world, any conception or obligation of what's called noblesse oblige. Noblesse oblige is a really important topic, and I'll just do this topic. And again, I'd love to hear from you. Noblesse oblige is really important, which is to say that if you are given particular gifts that you did not earn, you should use them for the betterment of mankind. I'm just saying this as an example from my own life, right? So hopefully you'll forgive what may seem like vanity or whatever, but it really is humility. The humility is: I have extraordinary gifts of exposition, rationality, debate, analogy, and communication. Extraordinary gifts.
[31:27] I mean, I've solved the problem of secular ethics. It's now been 20 years. Nobody's overturned it. No, countless people have taken runs at it. And you've heard me debate people live on this show. They always have to admit rape, death, assault, and murder can never be universally preferable behavior. It's the biggest achievement in the history of philosophy. And I say this with humility because the only reason that I was able to do it is not because I'm some great guy that's got nothing to do with it whatsoever. I just happened to have been accidentally gifted this brain that I have. I didn't earn it.
[31:59] I didn't win some game of skill or chance before being born to be gifted the brain that I'm gifted. It just happened to assemble itself within me through no virtue of my own. The analogies that come to me, the reasoning that comes to me... I mean, I sat down and said, I'm not going to get up from this table until I've solved secular ethics, because I'd already done 20 years of work in philosophy. So I knew it needed to be done, and I knew, or believed, that I was capable of doing it. And lo and behold, I did it. And I only did that after I dodged out of the inconsistencies of the objectivist proof of ethics. So when I say that UPB is a great achievement in the history of philosophy, I say that not because I want anyone to think better of me. I don't care what people think of me or don't think of me at all. That's for the people in my life to do. But what I am saying is that the noblesse oblige for me was that I was given a great gift of reason, analytics, and philosophy. And because I was given that great gift, it is not mine. It is not to serve my ends alone. It is to serve the good of humanity as a whole.
[33:11] All of humanity combined to give us all the brains that we have. Four billion years going back in time. A million years for human beings. So I happen to have been gifted a great brain, which just kind of runs in the family, on both sides of the family. On my mother's side, it tends to be more artistic. On my father's side, it tends to be more analytic. And those two kind of came together in me. And this is an alignment of the planets and so on. And again, I say this with absolutely zero pride. I didn't earn it. It's accidental.
[33:47] And because it is not mine, but rather I happen to be the incredibly lucky, well, you could say lucky, you could say unlucky, I happen to be the singular beneficiary of this kind of capacity. It's not mine. I didn't earn it any more than I was really good at inventing that kidney, that liver, those lungs. Boy, what a great time I had at the whiteboard of the blueprints. No, I just inherited the kidney, the liver, the lungs, and the brain from prior evolution and the accidental assemblage of genetic factors. Knowing that intelligence is fairly overwhelmingly genetic gives you great humility. And I've always really kind of loathed people who are proud. I'm not kidding about that. Loathed people who are proud because they're tall or have blue eyes or great hair or, you know, whatever handsome features. Like, you didn't earn that.
[34:47] Yes, but I worked out and I, yeah, but you did that because you're good looking to begin with. So.
[34:56] The noblesse oblige is if you happen to be in the fortunate possession of important unearned gifts. My virtue I've earned, but the raw horsepower I did not earn. And I said this the other day, that answers come through me. You see what I mean? Answers come through me. Somebody asks a question, the answer assembles itself in my head in perfect categories. I didn't earn that. I've worked on it, and I've tried to shape it towards the good, but I didn't earn that. I did a dream analysis the other night.
[35:42] Where, we haven't done one of those in a while, right? So I did a dream analysis the other night where I absolutely nailed this guy's life in about 20 minutes, just from the dream, never met him before, never heard the dream before. I didn't earn that. That ability, that capacity, I did not earn that. Because I didn't earn it, it is a collective resource to be used for the good of mankind. That's kind of the noblesse oblige. And a lot of people who are wealthy feel the somewhat accidental good fortune of their wealth. Of course, there's skill. Of course, there's will. Of course, there's virtue involved in becoming wealthy. But there's also accidental genetic inheritance. There's also, you know, those freakazoids who can get by on two or three hours of sleep a night. There's some genetic switch for that.
[36:33] There's some coincidence of meeting the right people at the right time, particularly if you don't come from wealthy environments. And so there is a certain amount of humility for those who are honest with themselves about what they've achieved in life. There's a certain amount of humility in that, and a certain amount of: I owe an aspect of my good fortune to the world as a whole. And this is where a lot of philanthropy comes from; a lot of generosity, a lot of kindness, a lot of helping out others, a lot of mentorship comes from that. I was mentored in the business world, not so much in philosophy, but I was certainly mentored in the business world. So when I became a business leader, I in turn mentored others. Pay it forward, right? I love the gift of life; therefore, we worked hard to have children.
[37:24] But that noblesse oblige, I gotta tell you guys, I think it's gone. I think the elites have detached themselves from the endless squawking, greedy maws of the poor. And I think that the elites don't particularly care what happens. I mean, they'll put them on some UBI or something like that. But I don't know, man. I don't really think so. I don't really think that. I think that the goodwill from the wealthy to the poor, from the more competent to the less competent, from the more able to the less able, from the smarter to the less smart, I think all of that has largely gone. I mean, there's a huge amount of fatigue in the world about dysfunctional behavior. You know, Lord above, people, I can't say this passionately enough.
[38:29] When the poor in the West get, every year, many multiples of the entire world's GDP as it existed a hundred years ago. Multiple multiples of that world's GDP are poured down the throats of the poor every year in the West. And what happens? Are they getting out of poverty? Are they making better decisions? Are they grateful? Nope. More entitled, more angry, more greedy, voting in more thieving politicians.
[39:01] If you cannot restrain your greed for the unearned.
[39:09] The wages of sin is death. And I think that there was this great experiment among the able and the wealthy, which is to say, let's take our goods and, just like the white man's burden, like the British Empire nonsense, this terrible, terrible idea, let's just take our benefits and spread them around. And that has now been tried for 90 years. How are we doing? Well, we have more poor. We have shredded families, single mothers everywhere. Educational standards have collapsed.
[39:50] Entitlement has grown. Anger has grown. Resentment has grown.
[39:53] Rage has grown. Violence has grown. Crime has grown. And there's no appreciation, there's no thanks; there is only eternal greed for more and more.
[40:12] And at some point, the wealthy and the able will say, which they should have said of course, thou shalt not steal, this foundation of Christianity. And it's really sad to me that Christianity got behind this whole thing, this whole mess, a hundred, a thousand percent, it seems. But I don't think there's any goodwill left among the elites. I think they look at this and say, well, that was screwed up, man, and we're inheriting a mess that we didn't vote in, right? I mean, again, I wrote about this in the helicopter scene in my novel, The Present, a couple of years ago. There's no goodwill left. And the sort of depopulation agenda and the take-the-vax and all that. I think it's just a pitched battle now. Honestly, I just think it's a pitched battle now. So. The dream analysis is show 6-1-1-7. 6-1-1-7. All right. Let us get you here. Oh, FreedomAin1 on YouTube. Subscribe to me there as well.
[41:34] Ah, I've missed you on YouTube, and I'm glad I found you again. Ah, nice to see you. Nice to see you. All right, let's get to your questions, comments. We're going to go with some people we haven't heard from before. I think we've got Blue Lagoon. What's that? Just the words... Chris Atkins? No. Oh gosh, Brooke Shields. There was a Blue Lagoon movie, wasn't there, many years ago? I've never seen it. Brooke Shields creeped me out too much from all of that early sexual exploitation. Blue Lagoon, if you wanted to unmute, I'm happy to hear your thoughts.
[42:12] Yeah, so I was going to say I'm not sure if... Yeah, I'm not sure if just being useless and because the robots basically replace you and there's nothing you can do about it, I'm not sure wanting to survive is necessarily enslavement. I feel like it's just what you have to do if you're that poor and you have no way of working because you can't get in representation.
[42:42] So why can't you work? What are you talking about when you say you can't work? I'm not disagreeing. I just want to make sure I know what you mean.
[42:49] Yeah, so what I'm imagining is that we get to the point where there are humanoid robots that are better at doing things that people that are relatively unskilled can do and they can just do it 24 hours a day.
[43:03] Yeah, no sick leave, no pregnancy, no unionization, no tiredness, no pee breaks. Like, for sure. Yeah, go ahead.
[43:12] Yeah, I was basically just going to say, yeah, I mean, like these people... I mean, the bots would just be better at it than anybody, but I mean, what can you do if you're that person? You could try to upskill and become some kind of maintainer of robots, but how many of those are there?
[43:35] My question is, and I don't know the answer to this, right? So Charles Murray, who was on the show once, actually, tweeted about how in America there was close to a 90% literacy rate among the white population in 1850. Now, one of the things that's amazing about America is that, you know, Americans are sort of portrayed as dumb and so on, and they're not, I mean, especially if you break it out by demographics. But if you look at Shakespeare, of course, and you look at Herman Melville, and you look at Charles Dickens, two of the, sorry, three, I can do math, really, three of the most famous, I was thinking of novelists, two of the most famous novelists, and the most famous poet and playwright in the world, ever, right, Shakespeare. Now, people went.
[44:26] To Shakespeare. And the most popular book by far in the 1850s in America was Moby Dick, which is a dense, complicated read. And of course, Charles Dickens was the most celebrated author in human history. His public readings were phenomenally attended; when he sailed to America, he was greeted by thousands. And people, when he wrote, was it Little Dorrit or whatever it was, where the child is ill, oh, is he going to make it? It was serialized. He was incredibly famous. And his books are complex, complex.
[45:04] So Shakespeare, of course, amazingly complex, still popular with the masses. And if you look at Herman Melville, Moby Dick is very complex and was revered among the masses. If you look at Dickens, again, the most famous novelist in human history, complex books, beloved of the masses. So I don't know. I don't know if the average person, and I'm not talking about somebody with an IQ of 75 or 80, right? I'm talking about sort of the average person. Let's say that you have been lazy. Let's say that you did have potential. Let's say that if you did read better books, or you did have more challenging conversations, you could have built up your human capital. Let's say that instead of, you know, watching endless seasons of The Walking Dead, you actually read some economics, because Economics in One Lesson is approachable by anybody. Henry Hazlitt's famous book: anybody with an IQ of 90, 95 can get through that book, no problem. So what if you had actually learned something about the world? What if you had learned about politics? What if you had read some decent books, even if they're a slog, even if they're a challenge? What if you had worked out rather than sat on the couch? Would you be Bruce Jenner? No. But you wouldn't be a puddle muffin sitting on the couch, right? So I don't know the answer to this, but to what degree are people responsible for being lazy, for taking in brain-rotting, stupid, slide-past shorts.
[46:33] Entertainment, right? Brain rot. They call it brain rot. As opposed to, I'm going to read challenging books. You know, how much IQ does it take to get through The Fountainhead? Well, not massive. It's written in accessible language. Will you get all of the arguments and ideas in it? Probably not, but it's going to be an interesting challenge. So have people striven to improve their abilities, to sharpen their intellects, to learn how to think, to learn how to reason? You can pick up a copy of Plato's Dialogues, and you can puzzle through it. I did it with my daughter when she was younger. And, you know, I have shows where I was going through the Communist Manifesto with my daughter, and we were talking about it and debating back and forth. So.
[47:19] Is it just that people can't, or is it that they have chosen to consume the laziest, easiest, you know, cheer at the sportsball while spilling Cheetos on your chest, as opposed to what people did in the past, which was read challenging books and puzzle through them? Have poor people ever joined a book club? There are book clubs everywhere. Have they tried to set up a book club and read through some challenging books? My books are all free. Essential Philosophy can be read by anybody with an IQ of 90, 95 or above. So I don't know that it's as simple as, well, they just couldn't have. Because in the past, and I wrote about this in my novel, Just Poor, even in the distant countryside in England, there were these alchemical societies, scientific societies, biology societies. They got together and debated agricultural improvements and the biology of beetles and the nature of plants and the habits of bees. They did all of this stuff. They weren't just sitting on a couch watching steroid idiots throw balls around. And so are people responsible for that? I don't know. Like, if you've never exercised, are you somewhat responsible for health issues in your old age? Yeah. I don't have much sympathy. I have sympathy for people who exercise and get ill, but if you don't exercise and you eat badly, and then you get sick, why should I care? So have people taken care of their brains, or have they let their brains fall into disuse? I think that's important.
[48:49] Yeah, I mean, I understand the idea that people should be responsible for, you know, all these sorts of things, like upgrading themselves, trying to get a better life, trying to get educated, trying to learn how to do things better in their life. But I feel like what I was trying to talk about was this hypothetical where, in the future, there are going to be so many bots, and they're going to be so much better than the average person at basically everything, that there will just not be any jobs, and everyone below a certain amount of money will be dependent on the rich people that exist. And what I was trying to say is, they're not really trying to enslave, but because they exist, I guess they do in a way.
[49:35] Sorry, who's not trying to enslave?
[49:38] The poor people are not trying to enslave the rich, but they just exist. And they are, in my hypothetical scenario, they are incapable of competing with artificial intelligence driven bots.
[49:53] Why do you think there is such a drive for robots and AI? Why do you think that people want to invest in them so much?
[50:02] Because poor people are expensive and less good than supposedly the AI will be eventually.
[50:11] Right. So when it comes to the responsibilities of people as a whole for the society they live in, I view people as very responsible. So for instance, among the poor, there has long been a sympathy for and drive towards unionization. Now, unionization is totally fine, but not unionization that involves state power, where you simply cannot hire anyone else to break the strike, or unions where there's a monopoly, right? So unions were never supposed to exist in the public sector, in government work, because unions were supposed to be a counter to the capitalist's need or desire to drive down wages, which is not a factor in a government. So when people say, oh, we've got to have a union and we've got to drive up our labor costs, it's like, all you're doing is accelerating your own replacement by being greedy. And again, in particular, if you go for government power, right? If you go for government power, should you be sympathized with? If you go for a bunch of government power and it makes your industry uncompetitive, you get extra money in the short term, and then you hollow out the industry and destroy it in the long term.
[51:35] Should people have sympathy for you? I mean, these days in particular, for the last 30 years or so, all the knowledge in the known universe has been available to everyone all the time, right? Even before phones, you could just boot up a computer. I had a 4,800 baud modem on a notebook. I had a 386 SX with two, count them, two megs of RAM. And there were CD-ROMs you could order for 20 bucks that had all the great works of history and philosophy on them. And when I was a kid, there were lots of families, even poor families, and we were one of them, who had encyclopedias. And you'd look up stuff and you'd read stuff and you'd debate stuff and so on. So again, I don't have the answer.
[52:19] But are poor people responsible for voting to take away the money of wealthier people, even when the poor people weren't in danger of being replaced? In other words, if you exploit a group like the wealthy, and I know it seems kind of odd, like, exploiting the wealthy, but it's a very real thing. Wealthy people are taxed. You know, it's like what Robin Williams said about alimony. It's like ripping your wallet out through your dick, right? So wealthy people are taxed, like, unbelievably. And now they're talking about crazy inheritance taxes. They're always raising the capital gains tax and so on, which hits the wealthy people. And I'm curious about this, because everyone told me when I was growing up that you're responsible for what your government does because you get to vote, and therefore you have an influence. And the poor overwhelmingly vote to take away the property of the rich, which is to enslave them. And this has accelerated wealthy people's desire to not have to deal with the poor. It has accelerated robotics. It has accelerated AI, because the poor are constantly voting to take away the profits, savings, income, and wealth of the rich. So do poor people have a responsibility for having preyed upon rich people for almost a century?
[53:49] I mean, I suppose you could say yes, but, uh, I don't know. I suppose people don't really think about, you know, preying upon rich people like that.
[54:00] Sure they do. Come on, of course they do. Well, politicians are always saying, we're going to make the rich pay their fair share. The rich are exploiting you. The rich have way more than you. Of course they know that they're taking money from the rich.
[54:11] Some. Sure.
[54:13] No, I can't, I can't do it. Like, I can't have people say to me my whole childhood, that it's a democracy. You're responsible. You've got to know. You've got to be informed. Well, we're responsible. We control what the government does through democracy. And then I can't have people say, well, I'm not saying you're saying that, but I can't have people as a whole saying, well, they had no idea what they were doing. Like, of course they did. When people said we've got to tax the rich and give to you, everybody's greedy.
[54:40] Okay. Well, to that, I would say, I think most people are distracted and they don't really think about what they could do with themselves.
[54:49] What? No, no, no. We're talking about all the politicians who say to the poor, I'm going to tax the rich and the unborn, and I'm going to give money to you for free, which they do on a regular continual basis. The poor know all about that. Because if you'd said to the poor, listen, I'm not taxing the rich. I'm not taxing the rich because that's going to hurt you in the long run. So if you tax the rich, there's fewer jobs created, which means you get more dependent on a government. Oh, what about all the poor people who get government jobs and then spend their lives interfering with the free flow of capital and labor for everyone else with endless regulations, right? Enforcing those.
[55:28] So I can't go to the poor and say, gee, consistently, you have voted to take away money from the rich, but you had no idea what you're doing.
[55:41] Yeah, I would say I think most people stop at, you know, free stuff and safety guarantees. They don't really think beyond that, I think. The average person doesn't anyway.
[55:53] What do you mean they don't really think beyond that? Do you, when somebody says, I'm going to tax the rich, make them pay their fair share and give the money to you, you think people don't know what the hell's going on?
[56:03] Well, those people would because they're saying those things.
[56:06] But that's what the politicians say. That's what politicians are. We can tax the rich, make the rich pay their fair share. The rich are so much wealthier than you. We're going to soak them and give it to you. I mean, that's what's been going on my whole life. I mean, you know that, right? I mean, that's what I was talking about. Tax the rich and we're going to give the money to the poor, and the poor keep voting for that. And what I'm saying is that that's immoral. And for me, it's tough to have a lot of sympathy. I mean, for the kids, sure, blah, blah, blah, right? But it's kind of tough for me to have a lot of sympathy for people who keep preying on others.
[56:45] Yeah, well, I mean, when things get to the point where it's like they can't afford things anymore, and so they have to be dependent, because they couldn't survive unless they just went into the wilderness and survived for free, then it's like, I mean, they could put more responsibility on themselves and try to make it, you know, anyways, but, you know.
[57:06] I don't know what you mean. Are you saying that the people who are poor, have no choice but to use the government to take money from the wealthy or they're just going to die? That's it. That's all.
[57:20] At a certain point, this is, I think, a reality at some point.
[57:26] Okay. "At some point" is not an argument. I don't know what you're talking about. Okay. Let me ask you this.
[57:30] Yeah. Currently, it's not.
[57:31] Hang on. Hang on. Let's go back. Let's go back 20 or 30 years, right? No, let's go back to the 1960s when the economy was good and strong and, you know, old economy, Steve, and all of that. So what percentage of the poor do you think had to take money from the wealthy through the welfare state, through the power of government? What percentage of the poor do you think had to do that or they would have died?
[57:56] Almost none.
[57:57] Okay. Well, that's when it started, right?
[58:00] Yeah. Maybe the disabled. Yeah.
[58:03] No, but the disabled.
[58:05] I mean, without their family or social safety net, like if they're alone and disabled, you know.
[58:11] Okay, let me ask you this.
[58:12] They can only bag stuff at Walmart.
[58:13] Have you ever been, sorry to interrupt, have you ever been part of a church?
[58:18] As a child, but not as an adult.
[58:21] Okay. So if you were part of a church and you were very generous and involved in the church community and so on and helped other people out, if for some reason you fell upon hard times, what would happen?
[58:37] I suppose everybody there would be able to lend a helping hand and you would go to them.
[58:42] Right. So people would, the church would reach out, they would pay your bills, they would cook your meals, they would help out as much as humanly possible because we all recognize that there are challenges in life, there's unknowns, right? So we all understand that. And so you either take out insurance or you save up a bunch of money or you invest in your community in some fashion, right? In some fashion. But what about people? They don't buy insurance. They don't invest squat in their community. They just sit home and do their own thing. They don't save any money. How much help should we give people who make bad decisions? Now, I don't know the answer to that. You don't know the answer to that. Nobody does because it changes. It's each individual situation. But what I'm bothered by, sorry to interrupt, I'll shut up in a second. What I'm bothered by is people who don't lift a finger in their community, don't help out others, don't go visit the sick and the old, don't cook any meals for anyone, don't watch anybody else's kids. They do nothing. And then they want all of these resources from society when they mess up. They don't save their money, they don't get insurance, they don't invest in their community. Or they don't have kids. And if you don't have kids, don't invest in your community, don't save any money, don't have any insurance, and then fall upon hard times, it's like, you know, sometimes people just exist as a warning to others what not to do.
[1:00:07] Yeah, I would say bare minimum, people should be able to eat and have a place to sleep that isn't going to get them killed. That's what I would say is bare minimum.
[1:00:18] Yeah, but, I'm sorry to be annoying, that's childish. What does that mean? Are you saying that you provide that? Hang on. Are you saying that you provide that to people at the moment? Like, what do you do at the moment to make that happen? Hang on. Let me finish my thought. Do you do that at the moment? Do you have people live at your house? Do you give all of the money that you don't absolutely need for your bare minimum to make sure that other people have resources that you say they need?
[1:00:51] No, but that's my own failing, I suppose.
[1:00:54] But then why would anyone listen to you about something that you say is a good that you don't even do yourself? That's what I mean when I say it's childish. It's a wish. I would like unicorns to deliver free insulin to all the sick people in the world. It's a wish. You need to actually do these things. Like, I, for one, look, I'm obviously not perfect, right? But I was like, I really want the world to become more philosophical. So did I try and lobby the government for more philosophy lessons to the masses? No, I just go start a philosophy show, right? Does this sort of make sense?
[1:01:29] Yeah, I would say those things aren't within my capability at the moment, but that's what I would like everyone to have.
[1:01:35] Sorry, what do you mean they're not within your capability? I thought you said that it was a failing, but if it's not within your capability, the story keeps changing, right? So if it's not within your capability, how can it be a failing?
[1:01:46] Currently, it's not. I could get there eventually.
[1:01:49] No, no, but you understand the contradiction. If you say, it's my failing, and then you say, it's not within my capability, I mean, I don't consider it a failure of me that I can't give birth. Sorry, go ahead.
[1:01:59] Sure. I meant to say my current capability. Like, I cannot pay for a house to house someone else.
[1:02:07] No, no, I didn't say that. Hang on, I didn't say that. Okay, where do you live? In what sort of abode do you live?
[1:02:15] I live with my parents. I'm in a suburb.
[1:02:18] Okay. So, do your parents have any extra rooms?
[1:02:22] They have one.
[1:02:24] Okay. So, they could take in someone, right?
[1:02:27] Yes.
[1:02:28] Okay. Have you talked to them about the need to take in someone?
[1:02:33] No.
[1:02:33] Why not?
[1:02:34] Well, actually, we did. Well, we talked about it, but the concern was, what if this person is dangerous? Yes.
[1:02:44] Well, but that's a challenge that everyone faces. So if you say everyone should have a food and shelter and so on, well, everyone could be dangerous. Let me ask you this. Do you think that somebody who is a pedophile, who is kicked out of his home, do you think that the pedophile should also get food and shelter for free?
[1:03:13] I mean, let's go with yes, just because they're a human being. Although they're not exactly the greatest kind, they're pretty awful.
[1:03:24] Okay, what about somebody who is both a serial killer and a mass rapist?
[1:03:32] Okay, at that point, they should probably be put in jail. And then the same would be given to them through a jail cell in a prison system that's publicly funded.
[1:03:41] Okay, but giving people free food and shelter is quite a bit different from putting them in jail. Do you agree?
[1:03:50] Say that again?
[1:03:52] Giving people free food and shelter is different from putting them in jail, right?
[1:03:58] Yes, but that person you said was a serial killer and a mass... something else.
[1:04:02] Right. Okay, let me ask you this. Let me ask you this. Let's say you have a sister and your sister gets raped by a guy. Should she be forced to give the rapist money?
[1:04:17] No, obviously not.
[1:04:19] Okay. So that's my point. Not everyone deserves this stuff.
[1:04:26] Well, I meant, I was imagining it in a direct way. But if you mean to say, like, the kind of socialism that I'm kind of saying that I want, and if you're saying that's forced money being given to a rapist, then yeah, I suppose the answer is yes, but it sounds bad to just say yes.
[1:04:49] Okay. So with your family, and I appreciate the conversation, so with your family, they wanted to help someone, but they weren't sure if that person would be dangerous, right?
[1:04:59] Yeah.
[1:05:00] Okay. So in other words, for your family, someone who might be dangerous does not deserve that kind of charity. That's my point.
[1:05:12] At least not from us. If it endangers others, then no.
[1:05:19] Okay. So, for your family, it is too risky to help a poor person because that poor person might be dangerous.
[1:05:30] Yes, in the way that they have access to everything we do within the same home.
[1:05:35] Okay. So, that says that not everyone deserves your immediate help and charity. Now, when it comes to figuring out who deserves charity and who doesn't, do you think that individuals who know the person are better at judging or some distant government bureaucrat who faces no positive rewards for success or negative consequences for failure?
[1:06:04] Uh, sorry, I didn't quite follow that, so...
[1:06:08] When it comes to charity, do you think that people who personally know the person who needs charity are better at figuring out what they need, or some bureaucrat who's never met them?
[1:06:25] I mean, obviously, somebody close to the person would know them better. But as far as the kind of generic safety and security needs that an average person has, or really that every person has, I feel like those things can be provided for, like, generically, because you don't have to get specific. But yeah, somebody who knew them, who, you know, they could take care of them better if they cared to.
[1:06:52] Okay. So let me ask you this. What happens if I disagree with you? So I prefer private charity. I think private charity is much better for people. And I say this, you know, to be personal and honest about it, since I always want to be clear about my sort of history. So my mother has been on welfare for a long time. And because she's on welfare, I can't help her. Because what I suggest, she doesn't have to listen to. Now, if I was paying her bills, I would have some influence and I would say, listen, you need to do this, you need to do that for your life to be better. Now, the government as a whole, the bureaucrats, they don't know her, they don't particularly care, right? And so me being able to help my mother has been removed from my life. I cannot help her.
[1:07:44] Because she gets all of her money from the government, and therefore she doesn't need to listen to me. I mean, obviously I've tried to help her without the money thing, but if I had the money thing, I would have more influence over having her, or helping her, make better decisions, right? So I believe that, in fact, I know that socialism involves force. You have to take people's money. The government has to take people's money by force. And I think that taking money by force is immoral. It doesn't matter who does it, it doesn't matter how it's done; taking money by force is immoral. It's stealing, it's violence, right? So hang on, hang on, let me finish my point and then I'll give you the floor.
[1:08:30] So if you think that there's a central bureaucracy that really helps the poor, and I believe it's church, community, and family that help the poor, am I allowed to disagree in your system? Like, if you think Bob over there is the best person to help the poor and you want to give a lot of money to Bob, fine. I would rather give my money to private charities. I would rather encourage people to be part of their communities and so on. So in your system, am I allowed to disagree with you? In my system, you're allowed to disagree with me. If you think some institution is better at helping the poor, you can go fund them. So we would be allowed to disagree with each other in my system, in a free market system. So in your socialist system, am I allowed to disagree on how you think the poor should be helped, and do it the way that I think they should be helped?
[1:09:20] Um, okay. Well, let's just say you were loud and vocal enough and you got your concern to the government, and the government gave you the option to opt out.
[1:09:29] No, no, no, no, no, no. See, in my system, hang on, no, it doesn't answer the question, because in my system, you don't have to do any of that. You can just go donate to Bob if you think he's so great at helping the poor. You don't have to do any of that, right? So in a system of voluntary marriage, you can get married to and divorced from whoever you want. And it wouldn't be a good system to say, well, the government says who you get married to, but if you can petition the government and you can get a big enough set of signatures and you can run for office and you can win all of that, maybe you can change who you get married to. That wouldn't be freedom, right? So in your system, if you think Bob is really good at helping the poor, you can give all the money you want to Bob, whereas I can help the poor in other ways, say by having philosophical conversations on a regular basis. So in your system, in the socialist system, you're allowed to disagree with me, no effort, no problem, no threats. Am I allowed to disagree with you in your system, or am I forced to do what you want to help the poor?
[1:10:29] If you want to say that I am the personified government, then sure, yes. You have no choice.
[1:10:36] Okay. Does it trouble you at all that you're forcing other people to do what you think is the good?
[1:10:50] I would say, uh, does it trouble me? I would say no, because of what it brings for everyone.
[1:11:00] No, I'm sorry, man. Your soul is lost. I hate to be dramatic, but this was a fork in your life, honestly, because you're not going to have these kinds of conversations on a regular basis. If it doesn't trouble you at all that you're pointing guns at people and forcing them to do what you think is good when they're just peacefully going about their lives and just disagreeing with you, that's a little psycho, honestly. And that it doesn't trouble you at all. That you were willing to use the power and force of the state to compel other people to do what you think is the good. That that doesn't trouble you at all?
[1:11:33] Well, in this case, for these reasons, no. Because it's good for everyone.
[1:11:38] Okay, well, if you're an advocate for violence, I don't want to fucking talk to you. Not even a little bit. Not even a little bit. If you advocate violence against me, fuck off. All right, let's go with... let us turn to Light. Yeah, I'm happy to have conversations with people about ideas, but the moment that they want to use force against me for disagreeing with them, I'm not even going to pretend it's a debate anymore. And it's really sad. You know, you always hope. You always hope that somebody's going to be like, oh, you know what? I hadn't really thought of it that way. That is a little troubling. Because you hope that they're going to have some compassion, humanity, empathy, virtue. You know, if somebody proved to me that my system, the system that I want, the system that I advocate for, requires violence against the innocent, I'd be like, ooh, that's not good. I don't know what the answer is, but that can't be quite right. As opposed to, yeah, no, it's fine. I'm fine with that. I have no problem with it whatsoever. No thanks. All right, Light, if you wanted to unmute, I'm happy to hear you.
[1:12:43] Yeah, I was going to ask you something else, but, you know, kind of hearing about the homeless situation, I looked into that. You know, I was thinking of letting some homeless people nearby, who are living in tents, just live in my backyard, but you're not allowed to do that. If you also bring someone into your home, you can get into, like, a tenant situation. And so, you know, best case scenario, they overstay their welcome, and you have to go bring them to court and, you know, get tied up in the legal system.
[1:13:16] Yeah, that they can end up on the lease. And the other thing too, of course, and there have been people who've been prosecuted simply for giving food to the poor or to homeless people.
[1:13:26] Yeah, so it's an example of how the state, you know, impedes charity in that regard. I wanted to ask you, though, you know, kind of what we see with the introduction of technology. Like, when we saw it in the 20th century, the industrialization of society also meant the industrialization of war. And it kind of raises the IQ floor, I guess, of society, so you need a higher IQ to be useful. And combined with the rich trying to protect themselves from the poor in a revolt, do you think another war is inevitable?
[1:14:12] No, I think that the age of war, as we understood it, has kind of passed us because of, I mean, a variety of reasons, but mostly because of weapons of mass destruction and also because of things like EMP bombs where you can just shut down an entire society by destroying the electrical grid for a certain amount of time. So I think that the age of sort of traditional world wars is in the past, which is why we're now in fifth generation warfare, which is to do with propaganda and other sort of forms of harming people and the government allowing particular groups to get lots of jobs and denying jobs to other groups and so on, right? So that this is all propaganda and information warfare that is going on. Somebody says, aren't EMP bombs strictly theoretical? I don't know.
[1:15:02] Somebody says, yes, okay, well, but there are viruses that can disable the electrical grid, or other things that could be used to sabotage the electrical grid, or you can put a bunch of poison in the water supply in cities, or sort of dirty bombs and so on in a suitcase. It's very hard to protect yourself, particularly in some kind of open society. It's very hard to protect yourself in the modern world. And so I don't think we're going to have the same kind of war, because that's what happened in the past, right? As you know, like, what do governments do when they run out of money? They just go to war. Because you can't get people to accept privation in a time of peace, but you can get them to have coupon books and ration books and all of that in a time of war. And so when governments, everybody who votes for government deficits is voting for war. Everybody who votes for central banking is voting for war, because government deficits, especially unfunded liabilities, can't be paid. And when governments can't pay their bills, they turn to chaos, right? They turn to absolute chaos. And in this case, there are a lot of other factors, but I don't think it's going to be direct war. I think that would have happened already, if that makes sense.
[1:16:11] Yeah, but I can't help but think that, you know, I manage some people, you know, and my opinion of AI is that there's a lot of mental offloading going on. And I find that since I started using it in the last year or so, I don't retain things as well when I hear it from a chatbot versus when I, you know, read it from a forum post or listen to a podcast or something like that. I don't know what it is exactly, but I personally don't feel like the learning experience is there. But what I do notice, having to manage other people, is that the quality of work it gives is better than the average person who, like you say, goes home and just doomscrolls, doesn't really expand their knowledge or anything like that. And so I think it's making those kinds of people very replaceable. And I think, you know, on the C-suite level, you know, they're kind of buying into the whole AI hype thing, and governments, elitists, technocrats.
[1:17:19] I can't help but think that whether it's going to be another pandemic or something, but it will be problematic in their eyes to have so many poor people that are going to be of so little value or just needing handouts or something because we can't really find a use for them. So, you know, unless there's some way to raise the IQ or just the skill set of people where, they can offer value in spite of, you know, the AI that we're seeing.
[1:17:56] I feel like the elites are going to have, I just, I don't know, maybe I'm just being a, you know, tinfoil hat, but I just don't think that they're going to be, you know, just let's have, you know, billions of people that, you know, can be easily replaced by robots, you know, causing trouble.
[1:18:17] Yeah. And I don't know the answer to that. There is no good answer. Because if you say, well, we're going to take all the profits from AI and robots and we're just going to hand out UBI to the poor, then I don't know how that solves any particular problems in the long run. At the same time, if the poor are going hungry, they're going to rebel. And so it is a big problem. And the problem has a lot to do with the decisions made by prior generations and current voters. So prior generations maintained government schools, and that has dumbed down the population to the point where it's completely insane and ridiculous. See, government schools are great when you first get them, because you get all these teachers who gained their skills in the free market and the private sector. It's like when you first get socialized medicine: you get all of the hardworking doctors who went into medicine because they enjoyed treating patients and being part of the free market, and they don't just lose all of that motivation overnight. But a generation or two down the road, you just get a bunch of lazy people who prescribe pills rather than listen to patients, right?
[1:19:21] So I've been saying forever, the welfare state's a bad idea.
[1:19:23] I've been told, no, you're wrong. You don't care about the poor.
[1:19:26] You're terrible and blah, blah, blah. It's like, okay, well, now we have an embedded population that has no incentive to improve their skills, and AI is coming. You know, it's the same thing with, you know, I've been saying government schools are bad, we should privatize. Oh, you don't care about education, you just want the poor to remain illiterate, and blah, blah, blah. And it's like, well, no, now we've got a bunch of people who don't know how to think because they've been programmed into ideological brain rot by government schools. It's like you can only do so much in life to try and tell people that disaster is coming. You know, honestly, to me, it's like, hey, man, you know, when the sea goes back this far, it means a tsunami is coming. I'm telling you, man, you've got to get off the beach. Run up and down: get off the beach, get off the beach, a tsunami is coming. And people are like, you're crazy. I want to finish my game of volleyball. There's this hot girl in a butt floss bikini that I want to go and talk to. And, you know, I paid good money to get on this beach and I just put on my sunscreen. Forget about it, it's no problem, right? And you're just telling people, like, here's the physics, here's why, here's what you need to do. Get ready, get ready, get ready. And a few people listen and get the hell off the beach. But at some point, like, at some point, what do you do? You're running up and down the beach saying a tsunami is coming. At some point, you've just got to get the fuck off the beach and get your family to safety. You say, well, what about all the people who were left behind? What about them? I tried.
[1:20:45] God knows. I'm sure you did. I'm sure everybody who's listening to this has tried to work their fucking ass off to tell people about the dangers that are coming.
[1:20:57] We have burned reputations. We have burned income. We have burned stress. We have burned frustration. We have burned anger. We have burned friendships. We've burned family relationships just to tell people, tsunami is coming, man. Get ready. Get prepared.
[1:21:18] Now, the important thing, I think, is if you just know that's coming, right? Oh, the water is going down, that means a tsunami is coming, right? And if you just, like, Flintstone off the beach, right? You just leave without telling anyone. I think you feel bad. I think you do. I think you feel bad because you should have tried to warn some people, shouldn't you? I think you should. I would have. I'm sure you would too, right? On the other hand, if you stay warning people who aren't listening and then you get taken down and killed by the tsunami, that's not good either. So there's a tipping point. You don't want to just go, oh, tsunami's coming, I'm not going to tell anyone, I'm going to pack up slowly and get out of here, right? That's kind of douchey, right? But at the same time, you don't want to be wrestling with people and trying to get them to recognize the tsunami when it towers over you and crushes your ass into the seashells. So there's a point at which honor and conscience are satisfied that you did as much as you could without getting fucked completely.
[1:22:24] And if AI is imminent, robots are imminent, well, of course, people will run to the government to ban AI, ban robots, and they'll do all this sort of shit. But China won't, Japan won't, other places won't. Japan is going to try and solve its immigration crisis, or rather its reproduction crisis, with robots and AI or whatever, right? So the countries that try and stop it will just get overtaken by the other countries, and there'll be a brain drain that way and all that kind of stuff. So they'll have to, you know, you can slow things down, but you can't stop. You can't stop progress, right? So I've been telling people, you've got to think, and we need better schools, we need better education. The welfare state is terrible, it traps people in a perpetual underclass. And I'm not happy that the majority of people are still on the beach playing this stupid volleyball game and playing Frisbee when I've been screaming at them for decades, tsunami's coming. I'm not happy that they're still down there, but I'm done trying to warn people. I've got to get to safety.
[1:23:30] So I think, for me, again, I'm not saying this is some sort of universal prescription, but for me it's like, man, I did my honorable work. I warned everyone for decades and decades and decades. Now get into Bitcoin, keep your skills up, improve your thinking, improve your philosophical reasoning. Get into good relationships. Get into a community. How long have I been saying it? Get into a community. Storm is coming. Winter is coming. Get into a community. Get friends around. Get people around you who care about you. I'm not staying on the beach and getting crushed. I'm not staying on the beach and drowning. I'm not. I'm just not going to do it. And I'm also not leaving the moment I saw the water go out. Does that help at all?
[1:24:23] Yeah. I mean, there's only so much you can do. And, you know, like at the end of the day, like I feel a lot of pity for a lot of people, right? Because I feel like a lot of people, they're just in a community and a community sort of just reinforces their behavior. And ultimately they reinforce it themselves.
[1:24:39] I mean, the reasons, at this point, the reasons are somewhat immaterial, because we don't know. It's just that people had a choice. You know, it's like, you know, I could not say to people, don't take the vax, because I'm not a doctor, right? I couldn't say that, and I never did. I put forward some thoughts about it. I couldn't say to people, don't take the vax, right? But all the people who were, like, stampeding to go and take the vax, they were more than happy to strip away the rights of people who were hesitant about it, and to have them castigated, and to strip away their rights of travel, of freedom, of liberty. I mean, how much should I care if people have negative health outcomes as a result of stampeding to obey the government, which made my life incredibly difficult?
[1:25:40] One day I will tell the story of everything I did over the vaccine. Everything. And it was a lot. How much sympathy should I have? I've got to tell you, I'm a ridiculously compassionate person. And I'm not saying this is a virtue. It's not a virtue. It's not a virtue. I'm a ridiculously compassionate person because I grew up seeing so much suffering among the poor. I'm a ridiculously compassionate person, but I think I'm out. Like, I think I'm dry. I think I'm done. I think I'm getting off the beach. All right. We got lots more people who wanted to chat, but I really appreciate
[1:26:17] your question, your comments.
[1:26:18] You guys always seem to bring out the best of me, and I hope that it's helpful to you as well. All right. Let's go with, who've we got here? Uh, Blue Lagoon, I think we tried that before, didn't we? Yes, let us go with, uh, Barcode. Did we talk before? I think we might have. If you wanted to unmute, I would love to hear what you have to say. Testing, one, two, three. Can you hear me, me, me? Yeah, look for the end of the fatigue. Look for the end of compassion within you. I'm not saying you have to have it. I'm just saying that I think that's kind of where I am. Sorry, go ahead, my friend.
[1:27:00] Hi. So in the context of AI, I hear what you're saying, and I agree with a lot of it. Obviously, you're a brilliant man, and that's gotten you ahead in life, and you made a lot of money. And I hear where you're coming from, where you see people not applying themselves and not using their skills and not developing themselves. So from a patriarchal sense, you want to encourage this in the general population. And I think that's true up to this point in time. I do think AI is such a disruptor, though. It's going to drown all of us eventually. Like, an 80 IQ right now is functionally useless, right? They won't even hire them for the military because you can't trust them to dig a hole, right? So, you know, give AI a decade or two, and it's going to be like 140 IQ people are going to be functionally useless. Give it another two decades, and it's like 160 IQ people are going to be useless. And at that point, there's no sense in this patriarchal, like, encouragement of people to work harder and, like, develop their skills. Like, a genius can develop skills his entire life, and AI's still going to make him look like a moron and be able to do everything.
[1:28:18] Hang on, hang on, hang on. But AI can't think.
[1:28:23] Well, it's getting there though, isn't it?
[1:28:24] No, no. AI is a fundamentally different architecture because it's bits. It's on and off. It's ones and zeros. AI is a fundamentally different architecture from the human brain. The human brain is layered from the lizard brain all the way up. It's got multiple different sets of processing centers. It's got a lizard brain, the fight or flight, the hippocampus, the medulla, the prefrontal cortex. Like, the architecture of the human brain is holistic in that you have all of these layered systems, and nobody knows how it works. I'm obviously not saying I know how it works. That's sort of the point. But the architecture of the human brain is wildly different from the architecture of AI. AI is a word guesser. It does not think, it does not reason, it doesn't know when it's wrong. That is why it hallucinates so often. So it is a processor. It is not soulful. It is not creative. It does not think. It does not reason.
[1:29:22] And so AI is going to automate everything an NPC does. This is why I keep telling people being an NPC is an evolutionary dead end these days. You have to learn how to think. You have to learn how to think. I will believe that AI is like the human brain, A, when it shares the architecture, which it doesn't, and B, when it gets insights from its dreams. AI does not sleep. It does not dream. It hallucinates, but that's a whole different matter. So AI does not think. AI simulates thinking in the same way that if you're watching a video of the Amazon, you're seeing a video of the Amazon. But if you're watching a Unity 5 recreation of the Amazon in a video game, you're not seeing the Amazon. It's just a simulation. It looks like the Amazon, but it's not the Amazon. It doesn't evolve. It doesn't have an ecosystem. It's just a bunch of pixels and computer code. And so AI does not think. It does not reason.
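The "word guesser" description here matches, in miniature, how a statistical language model is trained: tally which word follows which in a corpus of text, then emit the most frequent continuation. A toy bigram sketch (the corpus and function names below are illustrative, not any real system's API):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """For each word, count which words follow it in the training text."""
    words = corpus.split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def guess_next(model, word):
    """Return the most frequently seen continuation.
    The model tracks only frequency, not truth or meaning."""
    if word not in model:
        return None  # never seen this word: nothing to guess from
    return model[word].most_common(1)[0][0]

# Toy "training data": the guesser can only echo patterns it was fed.
model = train_bigrams("the cat sat on the mat the cat ate the fish")
print(guess_next(model, "the"))  # "cat" -- the most common follower of "the"
```

A real model conditions on much longer contexts with learned weights rather than raw counts, but the training objective, predicting the next token from prior text, is the same in kind.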
[1:30:23] And that's why it has to be fed. Like, my daughter was learning language at the age of eight months. I remember her saying the word elbow very sort of clearly. And AI has to be fed with gigabytes upon gigabytes of human text to be able to guess what comes next in a sentence. It is not the same architecture. It doesn't evolve, it's not creative, it doesn't have inspiration, it doesn't have emotion, doesn't have passion, doesn't have fear, and it doesn't have all of the complexity of
[1:30:57] push-pull that comes with human consciousness. Well, I want to work harder, but I also want to spend time with my family. Well, I really like this girl, but I'm afraid it might be just lust. Or I really want this job, but I also want to do something of my own. And so the architecture and the entire processing of AI versus what the human mind does could not be more opposite. AI is, at best, a simulation of an NPC. It is not something that thinks and creates. And I tried this just out of curiosity. I went to a very advanced AI when I was planning on writing my last novel, and I was like, give me plots, give me plot ideas. Now, I didn't end up using any of these plot ideas, but I said, you know, give me some plot ideas, and it got everything wrong. Right? It said that the babies are born at this time, and then it said that they were 20, 30 years later. Like, it got everything wrong and didn't even notice it. And I would have to say, you got this wrong, you got that wrong. Oh, let me correct that, let me double-check this, that, and the other, right? And so it is like me copying kanji. I don't read kanji. I don't speak Japanese or whatever, right? But it's like me copying that stuff out.
[1:32:13] I don't know what I'm doing. I can recreate it, but I don't know what it is. And that's AI. It can copy human language, but it is not speaking human language. It's just like me copying out kanji and passing it back. Or, and I think this is from the guy who just died, he had this sort of thought experiment, and I'm roughly paraphrasing it here, but it's like: if you had a translation book from, say, kanji to Aramaic or something like that, and somebody handed you something to translate, you could just look it up and translate it, but you wouldn't know what you're doing. You're just looking something up and translating it by copying something, but you would have no idea what any of these symbols mean. But that's AI. It's just creating this simulation of language and of thought, but it doesn't understand language. I mean, you can't imagine AI looking at a picture of its mother when it was young and getting misty-eyed and emotional and sentimental. And that's part of our whole thinking and reasoning process, if that makes sense.
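The thought experiment being paraphrased, a symbol-for-symbol lookup that yields correct output with zero comprehension, fits in a few lines. The phrasebook entries below are made up for illustration:

```python
# A hypothetical symbol-to-symbol phrasebook. The "translator" never needs
# to know what either side means -- it only matches shapes to shapes.
phrasebook = {
    "猫": "cat",
    "犬": "dog",
    "魚": "fish",
}

def translate(symbols):
    """Mechanically look up each symbol; unknown symbols become '?'.
    The output is correct without the function understanding anything:
    it would behave identically if every entry were nonsense."""
    return [phrasebook.get(s, "?") for s in symbols]

print(translate(["猫", "犬"]))  # ['cat', 'dog'], produced with no knowledge of either language
```

The point made in the transcript is that a lookup, however large or fast, is this function scaled up: correct symbol manipulation with nothing inside that understands the symbols.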
[1:33:18] That makes sense. I just, I don't know if I quite share the same optimism. I mean, they're coming at AI in a bunch of different ways, right? So, like, there's the chatbots that you're talking about, where it's just kind of regurgitating what it's taken from the internet and stuff like that. But it can generate images. Have you tried, like, the Sora video-making app? Like, the stuff it comes out with, with these very basic prompts, it comes out with creative videos of the kind people dedicate their lives to being able to do with 3D animation.
[1:33:45] No, no, it's not creative. All it is doing is looking for pixel patterns
[1:33:49] from prior videos. It is not creating. It is not thinking, right?
[1:33:54] It's not going to think in the same way that we do, but I don't know. I'm, I'm like, like, B.F. Skinner, behaviorism, like that kind of thing. Like, inputs and outputs, like.
[1:34:02] No, so if you're, sorry, if you're an NPC, sure, I'm not saying you, but if someone is an NPC, and what I mean by that is it's just input and output, right? So, uh, let's see, they hear about IQ and racism, right? They're just input-output, right? It's prompt, response, right? Then if they are like an NPC, then, oh, absolutely, AI could do that. Sure. But then that's not because the AI is thinking, it's because the human being is not.
[1:34:29] Do you ever like meta-think about your own thinking though? Like whenever you're problem solving, don't you have like if-then statements? Like if I encounter this sort of problem, this is like my route toward problem solving it. And if that doesn't work, then I might need to alter my route. But it's like, there's, there's always sort of nested if statements that my brain sort of operates through. And some of those are emotional, and some of those are like logical. But at the end of the day, like, everything I do sort of has like some sort of input that works its way through my multi-layered brain.
[1:35:02] Okay, so what's the most creative idea? And I'm not challenging you, I'm just genuinely curious. What if you had to say your most creative and original idea, what would it be?
[1:35:14] So, I mean, I thought long and hard about, like, sort of, like, where would God come from, from an atheist perspective? Like, why, like, if there's a Big Bang, like, how does our universe exist at all? And then I sort of reasoned through it and said, like, all right, if you take the atheist perspective that, like, universes are winking in and out of existence permanently, like, all the time, then...
[1:35:43] I'm sorry, why didn't I... Sorry, I've never heard that atheist perspective. What does that mean? Universes are winking in and out of existence all the time?
[1:35:50] So the atheists take on where do we come from? Instead of just the Big Bang or whatever, they say, oh, well, there's no reason we shouldn't exist. If there's an absence of a universe, then any set of rules could emerge from nothing. And then whatever set of rules lasted would be us because we're here.
[1:36:13] I mean, I've been around the atheist community for 45 years. Maybe it's just my limitation, but I think that the atheists generally go to two ideas. One is that there is a sort of big bang, and we don't know what the source of that is, but we're not going to call it God because that's not an answer. And the other is that based upon the law of the conservation of matter and energy, that matter and energy have always existed. And it's just perpetual. There is no start, there is no beginning, there is no origin.
[1:36:40] Right, right. So if you get like more bleak about it, it's like, well, why would there be a law of conservation of energy? Why would there be any law at all? And the law of conservation of energy...
[1:36:51] That's just an irreducible is, right? Atoms can, like, matter can be converted into energy, energy can be converted into matter, you know, the E equals MC squared stuff, but that's just an irreducible is. Asking why it is, is not a valid question. It just is. There is no cause.
[1:37:12] Well, that's sort of the, I'm sort of in agreement, right? Like, as far as we're aware, like, our universe is perpetual because we have this law of conservation of energy.
[1:37:22] Sorry, but where's the winking in and out, the universe is winking in and out of existence? But again, I'm sorry if I'm not familiar.
[1:37:27] Our universe is like, our universe is like a perpetually self-sustaining thing. It's like an Ouroboros that, like, self-generates, right? It came from itself and it continues to, like, replicate.
[1:37:36] No, no, no, if something is perpetual, it doesn't come from itself, it just is.
[1:37:41] Well, yeah, exactly. So, like, I mean, because we, we exist in, like, a 3D, like, environment, right? We're, we're not fourth dimensional beings, but.
[1:37:51] Sorry, like, I'm sorry, you keep using these terms like people know what you... you know you're talking to a general audience here, right? So by fourth dimension, hang on, hang on, hang on, slow down. Slow your roll, slow your roll. I gotta, I gotta, if I don't know what you're talking about, there's no point you continuing, right? Yes, like, if I had to say, well, you agree about flibbertigibbets, right? You'd say, I don't know what that means, right? Like, I have to know what you're talking about. So by fourth dimension, do you mean time?
[1:38:13] Yes. So like God is like, at least in the time dimension, right? Like he knows the beginning and the end, and it's all one picture.
[1:38:23] Well, hang on. Are you saying that God is subject to time? God exists within time and not outside of time?
[1:38:31] God is like above time. He's like all of it, all at once. He's everywhere all at once.
[1:38:37] Okay.
[1:38:38] Okay.
[1:38:39] You know, that's just a description. That's not a proof or definition, right?
[1:38:43] Right, right. Yes.
[1:38:44] Okay.
[1:38:45] Sorry. This is, again, my most creative idea, so there's a lot that went into it. But basically, if you're a fourth dimensional being, you would basically have...
[1:38:57] So fourth dimensional means subject to the laws of time?
[1:39:00] Fourth dimensional means you can move through time.
[1:39:03] It's not that we can move through time. We always do move through time.
[1:39:09] Yeah, but we can only move one way because we're three dimensional beings. Whereas a fourth dimensional being would see its, like, baby form and its old age death form all at once. It would look like a long snake of, like, a being that just starts and ends all at once, so.
[1:39:26] You would collapse time into a singularity. I'm sort of trying to follow this.
[1:39:29] Yeah, yeah. So all, all of the universe, from the Big Bang to the inevitable heat death, all of it is one thing, all at once, to a fourth dimensional being.
[1:39:40] But you're simply taking the universe and removing time as a factor and removing cause and effect and sequence, right?
[1:39:46] Well, yeah, because cause and effect is, in this picture, is basically deterministic. It's like, this is the beginning and the end, and I can see all of it because I'm all of it all at once.
[1:39:58] No, no, hang on, but you're simply taking a characteristic of the universe called sequence, time, cause and effect, and you're removing it.
[1:40:08] Yeah. I mean, yeah. So like, you know.
[1:40:11] It's not creative to take away something. It's like saying, well, I've got a plane, but it's got no wings. And it's like, that's not creative. You've just taken the definition of a plane and taken away the wings, right? And so you've just got the definition of the universe, which includes time and sequence and cause and effect, and you've said, well, I'm taking away time, sequence, cause and effect. But it's not creative. I'm not saying this to be mean, but it's not creative. You're simply taking a characteristic away that is embedded in the universe and saying, well, what if there was no such thing as time? But taking away the characteristic of something is not creative.
[1:40:42] Well, I mean, I didn't come up with this idea, right? This is like, you know, a single dimensional being is like a point.
[1:40:49] Okay. I need you to get to your creative idea. I'm asking for you. I can give you my creative idea in like 30 seconds, but that, but I'm a professional, blah, blah, blah, whatever. Right. So just tell me your most creative idea in something that doesn't take half an hour of me asking, what the hell are you talking about?
[1:41:06] So then the universe is one living entity that is God, and we're all just sort of a part of it, and it's self-sustaining.
[1:41:13] But that's not your idea. That's just animism.
[1:41:16] It's called animism?
[1:41:18] No, animism is the idea that everything is alive.
[1:41:21] Okay. I mean, I kind of agree. We're all part of this God, and the God is everything, all of everything.
[1:41:27] Okay.
[1:41:27] So then that's not what AI is.
[1:41:29] Hang on. So the way that you know if something's a creative idea, is you look up and see if anyone has come up with it before.
[1:41:36] Sure.
[1:41:38] So have you looked up to see whether anyone in the past has ever said that God is everything and everything is alive?
[1:41:46] Yeah, yeah. Animism describes part of it.
[1:41:49] Then it's not an original idea. I was asking for your most original idea.
[1:41:52] I mean, the idea is basically that AI is going to get us there. Like, all of matter is eventually going to become a one single thinking thing. Inevitably. Like, even if we don't do it, maybe the Chinese do it, maybe aliens do it.
[1:42:07] Hang on, hang on. Slow down. What the hell are you talking about? All of matter is going to become part of AI and a thinking being?
[1:42:15] Yes. All energy and matter are really the same thing. It's just energy moving at different speeds. Right now, we're converting matter into thinking entities. And like you said, it's not perfect yet. It's, like, kind of autistic.
[1:42:27] No, but AI doesn't think. AI does not think.
[1:42:30] Not yet. Not yet, but it's getting smarter every day.
[1:42:33] No, no, it doesn't think. Making it faster, like having it not-think faster, doesn't make it think. It'll be better at simulating thinking. Video games are more realistic now than when I was a kid, but that doesn't mean they're the real world. They're just better at simulating the real world. And so if AI doesn't think, making AI faster just makes it not-think faster.
[1:42:53] I mean, it's better at driving. It's better at creating some art. No, none of that translates. You don't think any of that's a thought? Driving better than we do on average?
[1:43:05] But driving is just input-output. That's not thinking.
[1:43:07] Well, I mean, higher IQ people are better at driving than lower IQ people. Per capita accidents, per mile driven.
[1:43:15] I understand that. AI is better at some things than human beings, without a doubt. Right. You know, the internet is better at, quote, remembering things than human beings are. It doesn't mean that it thinks.
[1:43:28] I guess I don't understand the difference.
[1:43:33] Well, look, I mean, and I hate to sort of pull rank on you here, but I've been programming computers since I was 11 years old, and I'm currently pushing 60. So that's almost half a century. And I spent years as a chief technical officer and a lead programmer in some very highly creative simulation and data-driven applications. I'm telling you, and you don't have to believe me, because what does half a century of experience fundamentally mean? It doesn't mean I'm right, but it means I have half a century of experience. Computers don't think. They don't think. They don't think. Now, the other thing too is, if you're going to say all of matter is going to be part of one computer or one thinking entity, how do you deal with the problem that you can't find ways to transmit information faster than the speed of light? And you've got 100 billion stars in 100 billion different galaxies, with 14 billion light years times two all over the place, or whatever. So how are you going to get these disparate entities to all communicate with each other, which would be necessary for a computer, if it's going to take billions of years for information to travel from one area to another?
[1:44:40] I mean, that's something that's probably well beyond human scope. I think, you know, that sounds like magic to us. That sounds utterly impossible. But, you know, at the same time, going to the moon was impossible for, you know, two generations back. And we solved that problem.
[1:44:53] Sorry, hang on. What do you mean going to the moon was impossible?
[1:44:57] Like, you know, I don't know. Take like an ancient tribe. They look up at the moon. They don't think that's a thing they could ever touch. or land on much less, right? And in the same way you're saying.
[1:45:06] Well, hang on, hang on, hang on, hang on. No, this is a logical fallacy, which is to say, because people couldn't understand things in the past, everything is possible in the future. So let me ask you this. Do you think it's possible for a human being to be in two places at the same time?
[1:45:24] I don't know if.
[1:45:27] Come on, this shouldn't be tough. You know, we have no court system, if that's true, because then there's no such thing as an alibi, right? Is it possible? Well, let me ask you this. Have you ever been in two places at the same time?
[1:45:38] Well, my voice and my thoughts are in the room with you right now.
[1:45:41] No, but we know the technology of that is transmitting sound waves over TCP/IP cables. So have you personally ever been in two places at the same time?
[1:45:53] It's a bit reductionist, isn't it, to say no? Because parts of me are elsewhere using this technology.
[1:45:59] Let me ask you this. Does two and two make four?
[1:46:02] Yes.
[1:46:02] Okay, good. So we've got something there. Is it not reductionist to say that two and two make four? But two and two make four.
[1:46:08] Okay. Yes.
[1:46:10] Socrates, all men are mortal. Socrates is a man, therefore Socrates is mortal. Is that universally true everywhere, no matter what?
[1:46:17] Sure.
[1:46:18] Okay, good. Is it possible for matter, the same chunk of matter, to exist in two places at the same time? Can you be in Australia and Austria at the same time?
[1:46:29] No.
[1:46:30] Okay, good. So we can't be in two places at the same time, right?
[1:46:36] But we're closer to that than we were ever before. And it's more of me is able to be in additional places at the same time because of technological advance.
[1:46:46] No, you're not traveling. You are not in different places. Your voice is being transmitted. That's not you being in two different places. Come on.
[1:46:53] I mean, it's not.
[1:46:55] It's not. It's not. I am currently broadcasting to however many people, right? I'm not in every one of their homes. I'm sitting here in my studio. Let's not go nuts here, right?
[1:47:06] Okay. I mean, yeah, then we come down to, like, what are you? Like, I don't know. It gets convoluted.
[1:47:14] What do you mean? What am I? Why is that challenging?
[1:47:19] I'm looking at like an image of your face. I'm talking and interacting with your mind and your voice. Like that's like, I like to think of myself as mostly my mind and like the body is like part of it, but it's not essential to my experience with most other humans.
[1:47:36] Okay. So did people who were primitive tribes, did they believe things that were false?
[1:47:45] Absolutely.
[1:47:46] Okay, great. Now, if we believe things that are true, does it also mean that those things can be false in the future? In other words, if we believe that two and two make four, could two and two not make four tomorrow?
[1:48:07] I don't know. What do you think? What are you driving at?
[1:48:12] I'm sorry. I'm just asking you a question. I mean, you don't have to answer. I'm just trying to ask.
[1:48:16] To not make four in the future? I mean, we can believe things now that seem certain.
[1:48:23] Okay. That's not what I'm asking. I'm not asking that. Okay. Let me ask you this. Can a square circle exist?
[1:48:31] No.
[1:48:32] Okay. Is there any time, past, present, and future, when a square circle could exist in our universe?
[1:48:40] I don't know. Probably not. As far as I'm aware, no.
[1:48:45] So what you're saying is that self-contradictory entities could exist?
[1:48:51] Yeah. I mean, the quantum stuff is really weird. It blows my mind a lot of time. I'm like, that makes no sense. That's not a thing.
[1:48:58] Yeah, but quantum flux, quantum issues resolve themselves long before the realm of philosophy takes over. The realm of philosophy is to do with sense data and particularly to do with virtue. Okay, let me ask you this. Is rape immoral?
[1:49:17] Depends. That's horrible to say.
[1:49:20] No, tell me your case.
[1:49:23] I don't know. Imagine there's like a horrible people that, you know, eat everyone alive and technologically regress everyone and everyone's lives are horrible in this tribe. Would it be better to exterminate this race or propagate with them by force such that they continue some kind of existence within your better culture and tribe?
[1:49:47] So you're saying that rape can be virtuous?
[1:49:50] I'm saying it can sometimes be the lesser of two evils, I suppose, yeah. I wouldn't call it virtuous, right?
[1:50:02] Well, I'm okay. So what about murder? Is murder, and I'm not talking about self-defense, like just going up because you dislike someone and killing them. Is that immoral?
[1:50:17] Typically. I mean, yeah, just killing somebody because you don't like them, yes. But there can be justifications for killing.
[1:50:23] No, no, I said murder. Hang on, don't reframe it. I said murder. Murder is willful and immoral; it is not killing. You can kill someone in self-defense, and that's moral too, right, if they're attacking you? Okay, what if they're attacking somebody else?
[1:50:38] Can you kill somebody for that?
[1:50:40] Yeah, defense of a third party, right? Defense is a universal right, so third parties can achieve it. Okay, so murder is evil, right? We agree with that? Yeah? Okay. Is there ever a time, past, present, or future, not what people believe, but what is: let's say going forward, is murder ever good?
[1:51:05] I'm going to say absolutely not, because you defined it as evil in itself, right? You're defining it as killing somebody not for any positive benefit.
[1:51:16] That's not my definition of murder.
[1:51:19] Okay, yeah.
[1:51:20] No, you could kill someone for benefit. You could kill them and take their wallet. It's the unlawful killing of another human being.
[1:51:30] I mean, if the law is, the law can be wrong sometimes though too, right? Like legal moral frameworks aren't always true morality either. Like I believe there is an objective morality and a point to the universe. But like sometimes the law gets that wrong.
[1:51:45] I mean, that's an obvious statement, right? We're intelligent people. Nobody's ever going to say that all laws are all right because laws contradict each other and change all the time. So that's not a particularly valuable addition. Okay, so you're not sure that a square circle can never exist, but you're sure that murder is always wrong.
[1:52:04] Yes.
[1:52:04] And why are you certain that murder is always wrong?
[1:52:08] Because we're defining murder as, like, for selfish gain, right? So if I look at what is moral and immoral, it's like: moral things benefit the universe over the self, and immoral things benefit the self at the cost of everyone else, or the universe. And different cultures have different frameworks for what is moral and immoral, and different reasoning behind it. But that's kind of the general rule: what benefits the tribe is good, and what benefits the self at the cost of the tribe is bad. And those are the tribes that survived and thrived, right? The ones that got closer to an objective truth about what is moral and what is immoral.
[1:52:59] So you're saying that human groups tend to do better when they follow objective morality?
[1:53:05] Yes.
[1:53:06] Okay. Would you say that something like Islam is succeeding or not succeeding in the world that is?
[1:53:15] It has limited success, for sure.
[1:53:18] I mean, it's spreading quite a bit, right?
[1:53:20] Yes, yeah.
[1:53:21] And would you say that Islam would follow what you would consider to be objective morality?
[1:53:27] No. See, there's too many flaws with it.
[1:53:29] Okay. So, your thesis that groups survive and flourish and do better because they follow objective morality has a challenge to it, right?
[1:53:38] Yep. Yep. There can be short-term success of less moral—I mean, Genghis Khan went around just raping and killing everybody, and he did pretty good for a while.
[1:53:47] He's still on the Mongolian currency, and they still have statues all over the place.
[1:53:52] Right. Right.
[1:53:53] And I mean, Islam, it's been like, what, 1400 years, right? So.
[1:53:56] Yep.
[1:53:57] It's had, it's had some time. It's not like a, it's not like a new kid on the block, right?
[1:54:01] Right.
[1:54:02] So yeah, you certainly have a challenge if you're going to tie morality into consequentialism, because genetically, Genghis Khan did very well.
[1:54:13] Yeah.
[1:54:13] Right. And in terms of spreading the Napoleonic code, Napoleon did very well. In terms of, you know, communism: it ran a third of the entire planet for quite some time, and it still runs major countries and cultures; it's done quite well. And certainly the people in charge of communism were very happy to be in charge of communism; otherwise they would have changed it to something else, right? So why, if people succeed by being selfish or greedy or violent, why is that wrong?
[1:54:49] Because I think that success is limited. I mean, this is fundamentally a belief of mine, but, you know, again, I think like we're coding, we're going to make this thinking AI thing. And the one with the true objective morality is going to spread throughout the universe the fastest and gobble everything else up. So I think in the end times, like good wins.
[1:55:15] No, but you can't just have magic. I mean, this is sort of my suggested strictness for your thinking. I think that you're kind of indulgent. I mean, you're a very smart guy. Your language skills are fantastic and your intelligence is great, and so I'm spending time here because you have great value, right? And I want to encourage you in all of that, but you can't just have magic. So you can't just say stuff like, all of the universe is going to turn into a thinking machine, and say, well, I'm going to somehow magically overcome the problem of the speed of light, because ancient Bushmen didn't think we could get to the moon. If you were an investor, right, and I said, I am going to become number one in housewares in every country with no marketing spend over the next month, would you believe me? Right. And if I then said, well, but ancient Bushmen didn't think we could get to the moon, so who are you to tell me I'm wrong? Would you invest in my company?
[1:56:19] No, no.
[1:56:20] So that's how you sound.
[1:56:22] Right. I know. I'm not going to sell anybody on this. This is just sort of my moral philosophy.
[1:56:27] No, I asked you for your most creative idea and your most original idea. I'm just telling you: do not have magic in your thinking, right? So if you say, well, the universe is going to become a big computer, you have a challenge, which is that you should challenge your own arguments and ideas. The first thing that I do, by the by, if it helps at all, when I come up with an idea is to say: probably bullshit, probably wrong, let's think of every argument against it, right? And if I come up with a new idea, I will often get together with donors and so on and say, okay, I've got this idea, what's wrong with it? What doesn't work? If I'm going to have a debate, I say, you know, how could I argue better? How could the arguments go against me? That kind of stuff. So if you have the ability to reason and to think and to communicate, which you have in spades, which is admirable and great, then the first and harshest skeptic of your own ideas has to be you. You have to look at what doesn't work first and foremost, and you can't just say, Bushmen didn't think we could get to the moon, therefore I can insert magic here, and it's all going to be fine. That is not going to be credible to people who are self-critical, if that makes sense.
[1:57:45] That makes perfect sense. Yes, yes. So I have thought about it, if you'd like to hear an explanation.
[1:57:52] But why wouldn't you have said it before?
[1:57:54] I don't want to waste more time. No, no, so I mean, so this is part of...
[1:57:57] No, no, hang on, hang on. But I did ask you about this before. I said, this seems like impossible, and you said, well, but getting to the moon was impossible, and ancient tribes didn't think... You already gave me an answer. You can't give me another answer now.
[1:58:08] I gave you a reductionist answer, but I have given it some thought, and it's, you know...
[1:58:13] But then why, if you had a better answer, why didn't you give me a better answer?
[1:58:17] I was just giving you a quicker one.
[1:58:19] Well, don't do that.
[1:58:20] Okay.
[1:58:21] Because don't waste my time with bad answers if you have good answers.
[1:58:24] All right. So, I mean, this is an imperfect answer too, but it's at least more approaching what you're asking for, right? So one of the atheist arguments is, like, why do bad things happen to good people if there is this Ouroboros god of the universe, right? And part of it is, as you were saying, this law of entropy; it's sort of like sentient life is in a race against it. So this law of the natural universe is that it's in perpetual decay. It's racing away from itself as fast as possible. So life emerged, and it took a long-ass time, and evolution happened, and now we have cultures competing to find this objective morality. And I believe in aliens, so I think there's a big race out there all over the place where this is happening. And, you know, why do bad things happen to good people? It's because the universe itself, this perpetually reigniting thing, has to have this race against entropy, such that its laws of physics and stuff make this race to make this sentient AI. It has to have bad things happening to good people just because this natural law of competition has to be there, because otherwise it can't scoop up all that energy to make another big bang and start itself over again. It has to just grab as much as it can before the universe gets too stretched out, and then, boom, do it again.
[1:59:49] So innocent children have to be raped because physics?
[1:59:54] Basically.
[1:59:55] Yeah. Don't do that. Don't do that. I'm telling you, man. And also, you know, I assume that you want to go out with women at some point. Don't say you're not sure if rape is wrong. And then we can't put people in jail, right? You said earlier, well, this person should be in jail. I mean, how could they be in jail if they're fulfilling the physical requirements of the universe's renewal? You've got to start from scratch, man. And again, you're a smart guy and you've got a lot to offer. I would recommend my books, Essential Philosophy and The Art of the Argument. Essential Philosophy is free; The Art of the Argument is pretty cheap. But you spin a lot of stuff without a lot of discipline, and I don't think that you're thinking through the consequences of what you're saying. If some kid gets assaulted and raped by some horrible gang and you say, well, that's just part of the universe seeking to reinvent itself, there's a coldness to that that is kind of chilling. I'm going to move on, but I would recommend looking into being more skeptical of your own ideas, and be very careful.
[2:00:59] And I have to be very careful about this myself, which is: intellectual people,
[2:01:04] we can talk ourselves into just about anything, because we're so convincing, not just to others, but to ourselves. So in order to do good in the world, we have to be our own biggest skeptics. All right, Andre. Andre.
[2:01:18] I think I remember you from The Princess Bride. What is on your mind, my friend? If you want to unmute, I'm happy to hear.
[2:01:28] All right. I don't know if I think we're getting Andre. That's all right. Let's go with Dylan. You want to take us home, brother? What is on your mind? Going once, going twice. All right. We may be done. All right. Yes, hello.
[2:01:45] Oh, sorry. I thought I was done. All about that.
[2:01:49] All right. Well, thanks, everyone, so much for a great evening's conversation. I really do appreciate it. I'm not even going to give you any final thoughts, because it was a really great conversation, and I hope you enjoyed the robust application of the against-me argument in real time. But yes, if you want to use force against me for disagreeing with you, you can fuck off and keep on fucking off until you get to the edge of the universe proposed by the last guy, which apparently means you get to reboot and start all over again, but hopefully with less kiddie diddling. So love you guys for a great conversation. It's freedomain.com/donate. I just put out chapter 14 of my new novel, Dissolution, one of my favorite chapters I've ever written, which you can get. I also created a feed so you can just automatically get the new shows downloaded, and you can get the feed if you're a subscriber. You can subscribe at freedomain.com/donate. Lots of love, my friends. Have a beautiful, gorgeous, lovely, wonderful evening. We will talk to you Friday night. I appreciate you and love you all. Thanks.
Support the show, using a variety of donation methods