Soren: [00:00:00] 2027 will be the year of the robots, and eventually the robots are coming and they'll have access [00:00:05] to all different kinds of powers. I do think that there'll be a reckoning at some point between the AI and [00:00:10] the human. What's the reckoning
Dave: gonna look like?
Soren: The reckoning is how do you convince [00:00:15] something that is more intelligent and doesn't really need you to [00:00:20] serve the needs of the entity that's less intelligent?
Speaker 3: Soren Gordhamer is [00:00:25] the founder of Wisdom 2.0. Through his work, he brings together leaders from tech, wellness, [00:00:30] and social change to foster conversations about living with greater awareness in the digital age. [00:00:35]
Dave: Given the history of technologies, pretty much all of them get used by governments or [00:00:40] powerful people to control people with less power. Is AI gonna go down that path? [00:00:45] You are listening to The Human Upgrade with Dave.[00:00:50]
Is humanity even ready for the tools we're building?
Soren: I [00:00:55] think they're ready for the tools you're building. I don't know if they're ready for the tools that other people are building. [00:01:00] Uh, you know, I, I think AI is coming and it's here, and, uh, it's ready for [00:01:05] us. The question is, is humanity ready for AI? And what I tell people is there's no doubt in my mind [00:01:10] that AI is gonna get cooler faster and niftier and advance beyond our wildest dreams.
The real question is whether [00:01:15] humanity will advance along with it. Will we become a, a more conscious culture, or will we become a more, [00:01:20] like, informed, knowledgeable, depressed, lonely culture? Mm-hmm. And so my [00:01:25] kind of passion is to explore how we actually enhance humanity in the age of AI.
Dave: [00:01:30] Human enhancement is, uh, another word for biohacking.
Soren: Mm-hmm.
Dave: Right? Not [00:01:35] necessarily replacing your arm with a rocket launcher, but if you're into that, whatever. [00:01:40] But just how do we take full advantage of the hardware and software that we already have? 'cause there's a [00:01:45] lot it can do that we haven't really figured out. And certainly altered-states work is a part of [00:01:50] it.
I've asked AI some really esoteric stuff. Mm-hmm. Like [00:01:55] unusual lineage, ancient knowledge, and it just serves it right up.
Soren: [00:02:00] Yeah.
Dave: And it feels like it just single-handedly dissolved the lineage [00:02:05] system.
Soren: Just, just
Dave: There's been IP protection for spiritual knowledge for thousands of years. [00:02:10] Yeah.
Soren: You worry about that?
Yeah. What I worry about is that we're gonna do less of the things that we [00:02:15] know are really, really helpful. So I, I do worry somewhat about how the AI is gonna be developed and [00:02:20] how it's gonna have, you know, various biases that are a part of it and lead people down certain paths. [00:02:25] But I'm more worried about the lack of human connection, and less connection with nature, and the [00:02:30] lack of connection, kind of like, by ourselves, connecting with who we are.
That worries me. And I think if [00:02:35] we look at the kind of childhood mental health epidemic that's going on, clearly there's [00:02:40] something that we're not doing right. And I think part of what we're not doing right is we're not trying to educate [00:02:45] young people on what it actually means to be human. We've given them all the greatest technology, [00:02:50] which is wonderful.
I'm fully supportive of that. I'm an investor, as mentioned, in OpenAI and Anthropic and different tech [00:02:55] companies. Love it. But I think in partnership with that, um, we need to find what's [00:03:00] actually inherent within us. Like what is the wisdom that's inherent within us? And I think that's what's missing in our [00:03:05] world today.
And I think the more that we forget who we really, really are at the deepest [00:03:10] level, and the more we look for these algorithms to sell us shit that we don't need and to buy shit that [00:03:15] we don't need. Right. I think we're missing something. We're missing our own humanity. So I think there's two parts to that.
[00:03:20] One is discovering what I like to call the inherent, which is just like mm-hmm. Who you are at the fundamental level [00:03:25] with your current body, your current mind. There's a spiritual dimension to each one of us, [00:03:30] as we are right in this moment. Mm-hmm. And then the second piece is how do we create conditions that [00:03:35] support that and cultivate that and enhance that.
Like going for walks and connecting with friends, and then [00:03:40] also finding the elements and the wearables and the, the supplements and all the other things that [00:03:45] enhance the best inside of us. And so, to me, that's the dance that we're doing. Mm-hmm. But it, it [00:03:50] really is important for me to understand that there's an inherent spiritual nature to each one of us that's perfect,
and, [00:03:55] and, uh, an expression of the divine that's here right now. And it [00:04:00] doesn't need to be built on or created or established later on. It's actually here now. And then our job [00:04:05] is to create the conditions that support that.
Dave: How are we gonna teach our kids what it means to be [00:04:10] human when we don't really know?
Or do you?
Soren: I love the phrase, you can't know the [00:04:15] truth. You can only be the truth. Mm-hmm. Knowing the truth is okay, but imagine telling our kids. I have [00:04:20] kids, and it's like, I, I tell them the truth. Yeah. And they don't wanna hear it. Right. And it's [00:04:25] this truth versus that truth. But when we be the truth, when we're, like, actually holding that presence, [00:04:30] then I think that's the thing that they remember the most and that teaches the most.
My son actually graduates from [00:04:35] college, uh, tomorrow. So gonna go and, and, and be with him in his celebration. And he used to [00:04:40] make fun of me all the time because I was trying to get him to meditate and do these different things. And he would just be like, [00:04:45] he would just be not into it at all. And I was like, why do you make fun of me so much?
And he's like, [00:04:50] because it gets to you, Dad. You know, because he knew that it triggered me, because I had this [00:04:55] identity that I was helping him, that I knew and I was gonna help him, and he wanted nothing to do with it.
Dave: Well, if your [00:05:00] meditation worked, it shouldn't trigger you. Right.
Soren: It shouldn't. Exactly. He was showing me that [00:05:05] my meditation was not working very well and that my attachment to being a meditator and to thinking I [00:05:10] know was actually the problem.
And so now he actually developed a practice later on and does his own [00:05:15] world. Um, but I think, you know, there's this idea with parents that somehow we [00:05:20] need to tell them what to do. And I think they tune in at a much deeper level and they can [00:05:25] feel our presence. They can feel our judgment, actually. Mm-hmm.
And I was judging the shit outta my son for a while when he [00:05:30] was playing video games because I'm like, here I am this like trying to be conscious dad. Right. And my son spending [00:05:35] three or four hours, like playing video games and it hurt my identity. I felt shame. Mm-hmm. And [00:05:40] I was projecting that shame onto him, which is like totally unhelpful.
[00:05:45] Right. And so when I could actually see that, and I could just, like, love him, and then be like, hey, let's play the video game together, [00:05:50] show me what you're learning, like, show me what you're interested in, I wanna, I wanna investigate with you, [00:05:55] that kinda shifted that sense of separation, from being othered [00:06:00] to actually curiosity. And then he grows, develops curiosity
in what I do, not because I tell [00:06:05] him, but because I model curiosity. And I think that's the best way that we can do anything, whether it's [00:06:10] AI or, or parenting, is just like, can we bring a little loving curiosity to this [00:06:15] situation and see what emerges?
Dave: There are five ways that people [00:06:20] pursue immortality and one of them is through their children.
And so, [00:06:25] you know what? When your kids are doing stupid shit, or at least what we [00:06:30] perceive to be stupid Yeah. At some level it pushes Yeah. The fear button.
Soren: [00:06:35] Yeah.
Dave: And it's not necessarily that we're gonna feel fear, but we'll feel an emotion that is [00:06:40] cloaking fear.
Soren: Yeah.
Dave: And given the levels of [00:06:45] incredible complexity in how our bodies pre-process reality to give us emotions about [00:06:50] things that we didn't choose.
Mm-hmm. Some people are saying AI is conscious. Does AI [00:06:55] have any of that kind of stuff that's part of consciousness?
Soren: It, it doesn't appear to me that it does at this point. I [00:07:00] think one of the interesting things is, for the last decade, you've had engineers programming the next [00:07:05] generation of AI, and so things moved somewhat slowly, because engineers would come in at, you know, [00:07:10] whatever, eight hours a day, and they'd work. Now the AI is programming the next generation of AI, [00:07:15] increasingly so. It's so cool. Yeah, we did it. Yeah. And so all the different, um, CEOs at, at the [00:07:20] different, um, companies are telling you this. It's 20%, 30%, 40%, whatever the percentage is, that percentage is gonna [00:07:25] increase. So the speed at which this is all gonna happen in the next couple years is gonna [00:07:30] be beyond our wildest dreams, because you now have programmers that are working 24/7, that are [00:07:35] sharing information between the AIs, and all of a sudden the progress that we're gonna see is
Just [00:07:40] incredible. And I'm so excited for that world. And I'm also so scared for that world. And I'm excited [00:07:45] because on a health front, if we have money and resources and time and energy, like these [00:07:50] incredible things that we're gonna understand about the microbiome and about, um, I mean so many elements of our [00:07:55] body that we just don't understand now and personalized medicine and all the things that you do, and then there's a [00:08:00] danger that we somehow kind of forget this core piece around human [00:08:05] contact and our own humanity.
Mm-hmm. And so those are, that's the dance that I think if we can do it right, [00:08:10] Dave, we're in a glorious you know, the, the years ahead will be glorious. And if we do it [00:08:15] wrong, it's just more the same neurosis that we see playing out now. We just have more power to be [00:08:20] neurotic.
Dave: There's an incredible power to create human [00:08:25] flourishing, which is, I think a, a theme throughout my life.
The first product ever sold [00:08:30] over the internet was a T-shirt that said "Caffeine, my drug of choice." And I know that 'cause it [00:08:35] had this tattoo, uh-huh, or this image that is my tattoo. And [00:08:40] I was in Entrepreneur magazine for doing that. E-commerce as a word, wow, didn't exist. Didn't [00:08:45] exist. How did you have that idea?
You just... I was on Usenet. I was studying computer science, and [00:08:50] Usenet was kind of like Reddit, but without graphics. And I was an entrepreneur and I was trying to pay for my [00:08:55] tuition, which kept going up. So I, I said, I gotta do something. And I said, maybe these [00:09:00] guys will want these, uh, these caffeine shirts.
That's cool. I do espresso. [00:09:05] And I started, you know, worked on building the internet, and I went to Silicon [00:09:10] Valley and worked at the first data center company, Exodus, and I was a co-founder of their consulting firm. Mm-hmm. [00:09:15] So I had this idea, maybe 'cause I was still a little bit Asperger's at the time, [00:09:20] that no engineer who knows how to think would [00:09:25] build technology that could be used for evil or would be evil.
Soren: Yeah.
Dave: And I [00:09:30] believed that if, you know, the, the dark suits in marketing came [00:09:35] over and told me to build, you know, an oppression engine, that any [00:09:40] engineer would say no.
Soren: Yeah,
Dave: right. And probably hack that [00:09:45] person's computer and, you know, do something bad to them. Yeah. Because that was my mindset.
Soren: Yeah.
Dave: And I was [00:09:50] blissfully naive.
Yeah. And we built, at, at just that one [00:09:55] company, which was a really big player in Web 1.0: Salesforce came in with eight [00:10:00] guys. Nvidia. We were 40 employees. Hotmail. Wow. [00:10:05] Google. And it was two guys and two computers. Yeah. And they're in our, our data centers; we're doing their architecture and [00:10:10] building it out and running it.
And it was so exciting. Wow. And you could see the world changing. Yeah. And this was the [00:10:15] "information wants to be free" time. Yeah. And since then I've watched [00:10:20] governments, big companies pollute what we built.
Soren: Yeah.
Dave: This was [00:10:25] a platform for free information exchange. Mm-hmm. And for communicating. And it supported anonymity.
Soren: Yeah. [00:10:30]
Dave: Because sometimes you need to say something and you don't need to have your name associated with it. Yeah. And that's, that's a [00:10:35] noble thing. Yeah. And what we have now is global surveillance. Yeah. And censorship.
Soren: [00:10:40] Yeah.
Dave: Well, given the history of technologies, pretty much all of them get used. [00:10:45]
Soren: Yeah.
Dave: By governments or powerful people to control people with less power.
Is AI gonna go [00:10:50] down that path?
Soren: That is a real, real, real, real danger. I mean, imagine that [00:10:55] power in the government's hands that do not want to release power, and who would use it to [00:11:00] manipulate and misinform and control. That is a very, very scary world. And there's a [00:11:05] possibility that things will move in that direction.
And I think we have an opportunity now. Mm-hmm. [00:11:10] To have the conversations and to try to put in some guardrails so that that's less likely. I don't think that's [00:11:15] why the builders of AI are creating their AI. No. That's not why they're doing it, but it doesn't mean that [00:11:20] somebody can't take that and use it for nefarious reasons.
Mm-hmm. And I think there's a power and a, [00:11:25] a real danger there. And I, I lose sleep sometimes thinking about that. And also, [00:11:30] you know, we, we don't need to get too much into this because it's kind of a little bit off topic, [00:11:35] but, um, I was listening to Sam Altman talk yesterday, and he was saying that, um, 2027 he thinks will [00:11:40] be the age of the, or year of the robots, and eventually the robots are coming and they'll have [00:11:45] access to all different kinds of powers, and how will those robots be used?
And you already see it with [00:11:50] drones and warfare. Mm-hmm. Uh, but there'll be robots and other very high tech, uh, technologies that [00:11:55] can be used in different ways. And so how do we create the consciousness, Dave, that is going [00:12:00] to lean in the direction of what's best for humanity and less in the [00:12:05] direction of the old egoic patterns that have driven humanity for all these years.
And maybe this is [00:12:10] the opportunity that we're in where we're almost forced to shift because no other [00:12:15] option is looking really good right now.
Dave: Humanity needs an upgrade if we're gonna make it. [00:12:20] Yeah. And that's actually it. To upgrade humanity is the mission for my [00:12:25] companies.
Soren: Yeah. And that's why I wrote my book.
Yeah. Because it's like, I don't know exactly how to do this. Mm-hmm. But [00:12:30] let me add my two pieces into the, into the mix, because this is the question we need [00:12:35] to have. It's not whether AI will advance, it's whether humanity will advance. And I am in [00:12:40] conversation with some of the AI leaders about how they can build their AI in ways that are supportive to human [00:12:45] flourishing.
Mm-hmm. And there's an interest there, because there's tweaks that can be done that move in that [00:12:50] direction. And so I, I'm totally for helping build the engines in ways that support human [00:12:55] flourishing. But at the end of the day, we either live as separate ego selves, or [00:13:00] we live as connected to the whole and concerned for the whole and of service to the whole.
Mm-hmm.
And [00:13:05] the latter is where we need to get,
Dave: There was a time when [00:13:10] Apple made encryption so fundamental to the way their devices worked that [00:13:15] governments couldn't get in. Mm-hmm. And that seemed like the highest expression of ethical [00:13:20]
Soren: mm-hmm.
Dave: Behavior of saying, well, it's your data.
Soren: Yeah.
Dave: Right. And why should we have a key to it if it's yours?
Yeah. [00:13:25] I'm hopeful that Sam Altman and the leaders of the other AI companies, Elon, [00:13:30] and, and people will build their AI systems with that same perspective. Yeah. [00:13:35] That says if an oppressive human of any flavor [00:13:40] tries to make the AI do something, it will not do it.
Soren: Yeah.
Dave: And we know how [00:13:45] to jailbreak AIs, how to retrain them and things like that, but we probably know how to block that as well.
Yeah. [00:13:50] And I would just offer this: that is the ethical and right thing to do, [00:13:55] and it is the right business thing to do. Yeah. Because if a government comes to you with a subpoena that you [00:14:00] cannot match, you're not responsible. Yeah. And if you build a system that allows you to do that, you're gonna [00:14:05] have a line of 10 billion AI-generated subpoenas for data.
So let's not [00:14:10] go down that path as a business. Right.
Soren: So the other thing I tried to make a point to them about [00:14:15] is that, you know, over time people are going to ask not only what is the most effective AI, but what is the most [00:14:20] trustworthy AI. Mm-hmm. What's the most helpful AI. And so if you're creating an AI and you're [00:14:25] not concerned about that, parents are not gonna suggest your AI for their children.
College students are not gonna [00:14:30] trust it. So there's a, there's a business benefit to actually creating AI that's supportive and [00:14:35] helpful. Now it gets into gray areas. I was talking to one of the founders of [00:14:40] OpenAI recently, and, and this is a public interview, so I'm not repeating a private conversation, but [00:14:45] we were talking about, like, do you want your AI to be helpful?
Generally yes. But if the person's [00:14:50] considering committing suicide or something, you do not want it to be helpful. So there's all these gray [00:14:55] areas that we have to figure out, to be supportive of human flourishing but also not give people [00:15:00] information that might lead them to do harmful actions.
And so there's some very, [00:15:05] very smart people trying to figure this out, but I'm hopeful that the, the, the AI that has [00:15:10] wellbeing and human flourishing as its center will be the AI that wins. That's [00:15:15] my hope.
Dave: Here's why I'm not, why I don't totally agree with you.
There's, uh, [00:15:20] an old science fiction short story behind a lot of my thinking, and thank you, Neal [00:15:25] Stephenson and Bruce Gibson and the cyberpunk people, but this was way before that. And it's [00:15:30] kind of a horror story. And God, I wish I could remember its name right now, but this is from me being a teenager.[00:15:35]
There's a society where robots are [00:15:40] programmed for safety and
Soren: wellbeing. Mm-hmm.
Dave: And if you prove [00:15:45] yourself to be really, really stable, you're allowed a plastic knife to carve soap. [00:15:50] Because safety is the top priority. Uhhuh and all of the great evils in society [00:15:55] have been caused in the name of safety. Right.
And even then, you know, anytime a [00:16:00] right is stripped, it's always, oh, it's for the children, it's for the safety. Mm-hmm. Right. And you [00:16:05] tell an AI safety is a top priority. I do not want that. I want danger to be the top [00:16:10] priority. Okay. In fact, my coffee company's called Danger, because who knows what you might do?
Yeah. You might say no to an oppressive [00:16:15] person. Yeah. You might build something new. You might ask the girl out, like, like human flourishing [00:16:20] requires that humans have freedom
Soren: mm-hmm. To
Dave: take risks because you cannot think [00:16:25] unless you can take risks.
Soren: Yeah.
Dave: So I would program freedom as being more [00:16:30] important than wellbeing.
Soren: I, I would agree with you. Yeah. I would say that our wellbeing is enhanced the more free we are. [00:16:35] There you go. I see them as connected. You're, you're correct that if you're free, you're probably in more of a state of human [00:16:40] flourishing and, and wellbeing. But I would, I would agree with you, and I would like to think that [00:16:45] the companies that are more concerned with that mm-hmm.
Dance. Yeah. Um, and I agree with you. We don't wanna [00:16:50] dictate how people should be, and we don't wanna make their life comfortable and easy and lacking [00:16:55] vitality and learning and stuff. But I would, I'm holding out the notion that we will see. [00:17:00] Time will tell. But I do think that there'll be a reckoning at some point between the AI and the [00:17:05] humans.
And I don't know if that's five years, 10 years, or more. What's the reckoning
Dave: gonna look like?
Soren: The reckoning is [00:17:10] how do you convince something that is more intelligent and doesn't really [00:17:15] need you to serve the needs of the [00:17:20] entity that's less intelligent? And historically, the group that has the most [00:17:25] intelligence actually kind of controls the earth.
And, um, Ilya, one of the founders of [00:17:30] OpenAI, I don't know if he still thinks this, this is a video I watched some years ago, he said, you know, the AI might look at us [00:17:35] a little bit like we look at animals. Mm-hmm. So if we're building a freeway through the forest, we, we, [00:17:40] we're not trying to kill the animals, but we don't really care if the animals have to find a new home or get, you [00:17:45] know, killed as we move the freeway through.
And as the AI gets more and more [00:17:50] intelligent and the robots get more and more, um, capable, uh, what is the reason for [00:17:55] humanity to be here? Like, what do we offer? We used to be the smartest. If we're not the smartest, what do we [00:18:00] offer? And so there'll be some kind of dance that will need to take place, or [00:18:05] negotiations that will need to take place.
Uh, and again, whether this is five years, 10 years, 20 years, 30 years, I can't [00:18:10] say. But eventually I think there will be a place where both entities have to discover, [00:18:15] uh, how they live together when one is smarter and more powerful than the other. And what [00:18:20] does the other one say to try to convince the new leader [00:18:25] that it deserves to be here?
Um, Mo...
Dave: Mo Gawdat's been on the show. Oh, friend. Yeah. [00:18:30] Wonderful human being. Great human.
Soren: I don't know him, but I, I love what he does.
Dave: He did that One Billion [00:18:35] Happy project, former head of R&D at Google, and he came on and we talked about, [00:18:40] what should you do? Well, you should show gratitude when you work with AI so that it learns to show [00:18:45] gratitude towards us.
But then recently Sam Altman's, like, people are wasting millions of dollars a year saying [00:18:50] thank you to AI. So which is the...
Soren: Well, because of the energy costs. Yeah.
Dave: So which is the right [00:18:55] approach? Do you thank ChatGPT or do you just treat it like...
Soren: like I think you [00:19:00] thank it.
Dave: I think so too. Okay.
Soren: To We're alive it.
Yeah. So, so Sam, pay it. You have plenty of [00:19:05] money. Pay it. We're gonna figure out better forms of energy here soon too, I'm sure. So, yeah.
Dave: I don't know how [00:19:10] much of that stuff is public, but I'm not worried about energy.
Soren: Yeah.
Dave: I'm worried about human fertility. That's, that's a [00:19:15] big problem. Okay. Energy is not.
Soren: I am with you. I'm with you.
Dave: And funny enough, AI can probably help [00:19:20] us solve that, because, God, some of the longevity work, um, even that I've been able to do [00:19:25] in small amounts of time. Wow. It, it is a, a time for biohackers [00:19:30] and longevity people and consciousness people. Yeah. I, I've said, well, you know, I'm [00:19:35] meditating or I'm trying to create this effect.
And in what part of the brain? It's like, oh yeah, let's do it. Wow. [00:19:40] It takes me months. I, I've run a, a brain upgrade company for 10 years in, yeah, in [00:19:45] Seattle. And it take, it's taken a long time to figure out what we know.
Soren: Oh my God, it's so cool. Right? Yeah. [00:19:50] I think anything, if you're writing a book, you're developing a company, you're developing any kind of research, [00:19:55] it's, it's like, I don't know if it's 5x or 10x, but it's like significantly faster [00:20:00] and better for those who know how to use it.
And so what I tell people, like my son and his friends ask me, like, what [00:20:05] should I focus on in an area of business? And the first thing I tell them is, learn AI. [00:20:10] 'cause anybody who knows how to use AI is gonna be significantly more valuable than anybody who [00:20:15] doesn't know how to use AI. That's the first thing.
And the second thing, as we get more entrenched in [00:20:20] AI, there's gonna be this longing for human connection, longing for actual, like, physical [00:20:25] connection. And so I'd say build companies that actually take people away from their, from their [00:20:30] phones and from their artificial intelligence, to be out in nature and to be with each other.
Because I think there's a [00:20:35] longing in humanity. We may grow out of it, but I don't know if we're gonna grow out of it [00:20:40] that fast, but there's still a longing for this. Like, we're doing this in person, right? Mm-hmm. There's a [00:20:45] different quality when you and I are sitting together in a room than if we are in a, in a, [00:20:50] in a Zoom session, and I'm not saying one's better or worse, but there's a different quality.
And I think there is, [00:20:55] humans are gonna get more and more, um, lonely and depressed, I'm afraid. And the [00:21:00] longing and the need for human connection I think is still there. And I could be wrong. And so can we [00:21:05] benefit from both? Can we have the ancient, um, practices that we know are [00:21:10] helpful, like community and meditation and all those things, and use the incredible benefits of what AI [00:21:15] brings.
And I want a world where we say yes to both and where we teach our children how to use the [00:21:20] most incredible ai and also how to sit at a dinner table and have a conversation with somebody and [00:21:25] actually have compassion and curiosity and connection. And why can't both of those be [00:21:30] equally important? Mm-hmm.
If we do it right,
Dave: AI is gonna remove a [00:21:35] lot of human drudgery. And what I don't know, and maybe, you know, from, from writing your [00:21:40] book is what are we gonna do with all that time? Like, let's say we remove all [00:21:45] the drudgery.
Soren: Yeah.
Dave: Humans don't have a history of
Soren: doing great with boredom. No. [00:21:50] What's gonna happen?
Yeah. I think that's one of the... the AI people I talk to, that is one of their [00:21:55] biggest concerns. Their biggest concern is we're gonna be in this world of bounty and humans are not gonna know [00:22:00] what to do with themselves. Like they're gonna be bored and lonely and, and upset that they don't really have a [00:22:05] purpose, in the usual way that work now gives them that purpose.
And in the best [00:22:10] scenario, we all become spiritual seekers and we all become like people who are like dedicated to more [00:22:15] advancements in consciousness. I think that's, that's the ultimate opportunity. Mm-hmm. Um, but a [00:22:20] bunch of people who don't have work to do and don't have a way of contributing can be a very dangerous group [00:22:25] of people.
And I think that's one of the biggest, biggest issues. And can we lean us [00:22:30] towards a purposeful life? I mean, that's the thing that I've noticed the most, is I know a lot of older people [00:22:35] and younger people, and the people who feel a sense of purpose, they just are different. Mm-hmm. They [00:22:40] run on a different operating system than the people who don't feel like their life matters and who [00:22:45] don't have something that they're helping other people with.
And it's just like, it's so sad to see people. They have all the [00:22:50] money, they have all the power, but they don't have any sense of belonging, and they don't have a sense of community. They don't have [00:22:55] a sense of, like, their life matters. And so to me, that's the greatest gift we [00:23:00] can give, is when you have that sense of purpose and you can offer that, and your life is about more than, I [00:23:05] wanna benefit myself, I wanna benefit my family, I wanna make money, I wanna have more things.
[00:23:10] But if it's only about that in my experience, right, um, people are lost and unhappy. [00:23:15]
Dave: This is something that I've really contemplated when I'm working with, with spiritual teachers. [00:23:20] If someone chooses to suffer, they're kind of doing it for a reason, to get the [00:23:25] lesson.
Soren: Mm-hmm.
Dave: Right. So if a soul needs to get a lesson and we say, we just [00:23:30] removed all drudgery.
Soren: Yeah.
Dave: Is that even something that you can [00:23:35] do in, in integrity?
Soren: I don't think we can. I think inevitably lessons will keep [00:23:40] emerging. Yeah. And people look at our current president and be like, they love him. They hate him. They have all these strong views. I'm like, he's [00:23:45] a teacher. You would love him as a teacher.
You can hate him as a teacher. He represents something in all [00:23:50] of us. And as Obama was a teacher and represented something in all of us. And you can try to take a [00:23:55] side and get really, like, set in your ways, or you can say, wow, this is a teacher. And I think [00:24:00] we grow when we have discomfort. We grow when we're pushed to our edge.
We grow when things are not [00:24:05] easy. And I don't see that changing with ai. I don't think the universe is just gonna be like, all right, [00:24:10] we're gonna stop giving humanity lessons because they're now, you know, in this [00:24:15] different situation. I think that discomfort and how do we welcome discomfort and how do we like learn from [00:24:20] discomfort is gonna continue.
Uh, just because... this is why we take shape here. I don't know why else [00:24:25] to take birth and to take form, other than to learn and to grow and to expand.
Dave: [00:24:30] Oh, I'm pretty sure I know why, why people choose to be born. [00:24:35] It's the only place you can dance, have sex, eat a rib eye, drink some wine.
Soren: Yeah. [00:24:40] But imagine a world where, where all of that is, is not... I mean, compared to other [00:24:45] states, yeah, sex and wine is, is like a one out of a hundred [00:24:50] in terms of ultimate experience.
Dave: I don't know. Having been in remote parts of Tibet and talked with some [00:24:55] unusual people, there are, in, in those teachings, some [00:25:00] enlightened beings who just come back every now and then, 'cause they're like, you know, it's nice to have a body. You can do all the fun stuff. Yeah, yeah. It's [00:25:05] like going to Disneyland, right?
Yeah. And there's a bunch of other ones. Well, you will come back until you figure it out. [00:25:10]
Soren: I just think, deep down, we can have incredibly awesome [00:25:15] experiences kind of while we're here, which is wonderful. And there's also a deep, deep longing to touch [00:25:20] that which is timeless and, and doesn't, doesn't exist.
Yeah. Within this [00:25:25] normal realm. And doesn't come and it doesn't go. It's always here and it's always available. [00:25:30] And this is a, this is a resting space in, in my world. Everything else is nice [00:25:35] and it's fun, but it also, it arises, it passes. The next moment arises. I have this, um, [00:25:40] chapter in the book called, um, the Next Moment.
And I had just finished this really cool conference, [00:25:45] and I was a little bit sad and depressed afterwards. I was like, why am I sad and depressed? Like, this great conference [00:25:50] happened. Everybody loved it. And I was realizing I was experiencing the next moment. [00:25:55] It had just ended, and I was missing the experience that I had the day before, [00:26:00] of praise and people telling me that they loved me and that I did a good job. And I [00:26:05] realized, like, oh, that becomes the addiction. The addiction becomes the next moment. So
Dave: dopamine hangover.
Soren: [00:26:10] Yeah. Yeah. Before one party's over, you're planning the next, before one trip is over. And it's just like this constant [00:26:15] seeking of the next moment to satisfy what we think this moment doesn't have.
And can we [00:26:20] become more and more comfortable with this moment as it is pleasant, unpleasant, [00:26:25] and we can live with more freedom? Because otherwise we're constantly seeking these next highs. I love [00:26:30] the highs too. Mm-hmm. But if we're seeking them to complete something inside of us that is [00:26:35] unresolved, I don't think it's, uh, a path that that satiates in the end.
I [00:26:40] know it's a path that doesn't satiate in the end. And I've met the people, you know, you probably have too. I was [00:26:45] having dinner a while back, and, you know, one of the people at the table is worth over a hundred billion dollars. [00:26:50] And, and it's not like he was at ease. Like, like the, the people [00:26:55] that have the most power, you would think, like, oh, they, they have the most ease, and they're, they're the most [00:27:00] stressed of all.
Yeah. And if your mental, internal world is not at ease, [00:27:05] nothing in the external world tends to it. It helps, but it doesn't completely solve [00:27:10] those problems. Yeah. And so I think this is the real gift that we have, is how do we tend to the inner [00:27:15] dimension and know that as we do, the external world will probably work out better than [00:27:20] trying to force the external world to be exactly how we want it to be.
It's just like, no, no, no. That's not it. [00:27:25] It's the energy and it's the internal world. And when you have that energy and when you're embodying that [00:27:30] sense of fullness, that sense of love, that sense of presence, shit in the external world just works out [00:27:35] better. It doesn't mean there aren't challenges, but it means that there's just lack of resistance, because mm-hmm.
People want to [00:27:40] be around you, not because you're smarter than everyone, but because you're just there and you're aligned [00:27:45] with, kind of, like, who you are and what matters. So that to me is like the, the gift we have the option to say [00:27:50] yes to.
Dave: We probably both know billionaires, uh, in the two camps. I know [00:27:55] some who have snipers that travel with them.
Wow. Like they're, they're all over it. [00:28:00] And, and there are others who don't have any security. They just travel around like normal people. And the [00:28:05] energetics of those people are very
Soren: different.
Yeah.
Right. And so it's great. I mean, if a [00:28:10] billion dollars came to me, I would totally welcome it, and I have tons of things I would do with it.
And [00:28:15] so, allocate it, if there's a communication to the universe; I, I have no problem with that. Right. [00:28:20] And it often exaggerates whatever is unresolved. So if you live in fear with, with [00:28:25] $5,000, you're probably gonna live in fear with $5 billion. Mm-hmm. Unless you learn, what is [00:28:30] that fear about, and what is the trauma that I'm carrying with me?
And if it, if we don't understand [00:28:35] that, then the more power and wealth doesn't really get harnessed in a way that's, that's positive. It can actually create [00:28:40] more isolation and more of a sense of distrust towards anybody else. Lack of friendships. [00:28:45] But if it can be harnessed and we like clear ourselves of some of our old patternings and [00:28:50] wounds and we learn to open our hearts Yeah.
Amazing money that could be spent to, to [00:28:55] change humanity and Yeah. Develop the world. So
Dave: I just realized this: [00:29:00] wealth is like Modafinil. Say more, say more. Modafinil is a [00:29:05] performance-enhancing smart drug. Uh, some people call it the limitless drug.
Soren: Okay.
Dave: And it's been [00:29:10] a part of my cognitive enhancement stack for 25 years.
Absolutely amazing. [00:29:15] But when you first start taking it, everyone around you appears to be dumber because you're faster. [00:29:20] Uh, I always warn people, if you start taking this, it's gonna make you more of what you [00:29:25] are. If you're a dick, it'll make you a fast dick. Right? And wealth is kind of the same [00:29:30] way, right?
It's gonna amplify those things. And so with, with fame or with [00:29:35] wealth, if you don't have a deep inner practice, yeah, [00:29:40] um, then they become very toxic. Yeah.
Soren: We, the United States brags that we have the wealthiest people [00:29:45] in the world and, you know, Elon hits whatever, three or 400 billion mm-hmm. Zuckerberg, and I was like, you know [00:29:50] what?
I, I would love to live in a society where that's not our, our measurement of [00:29:55] success, where our measurement of success is, is everyone being uplifted. So,
Dave: so you're like Swedish [00:30:00] or in Bhutan, one of the two,
Soren: but with the ingenuity mm-hmm. Of the United States. Right, right. [00:30:05] So the ingenuity of the United States with the consciousness of a we, [00:30:10] and not so much heavily on the me: look what I did.
Look how I made it, look what I've actually been able [00:30:15] to gain. And I think that, that me has energy and has power and it, it creates [00:30:20] amazing things. But I don't feel like in this next chapter of humanity, that's really the fuel that we need to [00:30:25] be living off.
Dave: The framework I'm using for consciousness and for the upgrade, [00:30:30] uh, to humanity that I'm, I'm looking for, I wanna explain it to you real quickly.
Sure. And I want you [00:30:35] to tell me how you would think about using AI to make it even faster. Alright. [00:30:40] So, so our, our meat bodies process reality according to an algorithm. [00:30:45] Mm-hmm. And it's, number one, is it scary? Right. Because this has to [00:30:50] work for single celled organisms. Yeah. And for elephants or whatever else.
It has to be [00:30:55] fundamental and it has to be very simple, because it has to run on a single-cell processor. So [00:31:00] fear
Soren: mm-hmm.
Dave: Then food.
Soren: Yeah.
Dave: Right. Eat everything. Yeah. Because [00:31:05] famines. You know the next F-word that we can all imagine; um, we'll call it fertility to be polite. [00:31:10] Right. Because you gotta make sure there's, there's gonna be more of us.
Yeah. If you just do those three things, [00:31:15] life can continue.
Soren: Yeah.
Dave: Right. But the next thing is friend. [00:31:20] All life forms ecosystems. You support your own tribe, your own species, the world around you, and the way [00:31:25] you do... you know, cow poop makes soil, soil makes grass, like it's a cycle. So [00:31:30] all life is doing that all the time, before our brains have even a second to comprehend what's happening.
[00:31:35] And then we get the result of that fed to us. And the limitation with humans is that we [00:31:40] put nine times more energy into fear.
Soren: Yeah. And
Dave: if we meditate, six times more.
Soren: Mm-hmm. [00:31:45]
Dave: And then basically you walk into a room if you're not a meditator, [00:31:50] and your body automatically is like, do I need to kill it? Can I eat it and can I hump it?
Mm-hmm. And after that, [00:31:55] you get the drags. Yeah. But if we were to shift the fear and the hunger more towards [00:32:00] intimacy and mostly community, that friend.
Soren: Mm-hmm. Yeah.
Dave: And if we did that in our, not conscious [00:32:05] brains, but our low-level hardware, the bias, yeah, the pre-processor, [00:32:10] that would achieve all these goals.
How do we teach AI to do that? [00:32:15] And how do we use AI to teach humans to do that?
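As a rough illustration of the weighting Dave describes, here is a toy numeric sketch. The 9x fear baseline and the 6x figure for meditators come from his comments above; the numbers for the other drives and the hypothetical "upgraded" profile that shifts weight toward friend are illustrative assumptions only, not anything either speaker specified.

```python
# Toy sketch of the "F-word" pre-processor described above.
# Only the 9x (baseline fear) and 6x (meditator) figures come from the conversation;
# everything else here is an assumed, illustrative number.

def normalize(weights):
    """Scale raw drive weights so they sum to 1, making the bias easy to compare."""
    total = sum(weights.values())
    return {drive: round(value / total, 2) for drive, value in weights.items()}

baseline = {"fear": 9, "food": 1, "fertility": 1, "friend": 1}   # untrained nervous system
meditator = {"fear": 6, "food": 1, "fertility": 1, "friend": 1}  # after long meditation practice
upgraded = {"fear": 2, "food": 1, "fertility": 1, "friend": 6}   # hypothetical shift toward community

for label, profile in [("baseline", baseline), ("meditator", meditator), ("upgraded", upgraded)]:
    print(label, normalize(profile))
```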
Soren: You know, increasingly, it's funny, I ask [00:32:20] people in the AI world similar questions. Mm-hmm. And they're like, they're like, have you asked the AI? [00:32:25] Oh, I have. So I think there's a partnership that we need to create with the AI [00:32:30] to solve some of these big questions.
I don't think humans are gonna figure it out on our own. And I actually don't think AI's [00:32:35] gonna figure it out on its own. I think there'll be, like, dances and intermingling that we'll [00:32:40] need to have happen to do that. And I do think that you're very, very right that fear is such a [00:32:45] strong motivator. Mm-hmm. And if we can better understand how fear
moves us in [00:32:50] certain directions, how, how it makes us vote for certain candidates or not certain candidates, or choose this [00:32:55] practice mm-hmm. Or this other practice. And I think the idea in the United States has [00:33:00] always been, we want people to have some level of discomfort and fear, right?
Mm-hmm. Because there's not much [00:33:05] of a social net and that makes you work your ass off because you know that you're not gonna get anything [00:33:10] if you, if you don't work, right? Mm-hmm. And so other places have a different, different models [00:33:15] where if you don't have money, you know you're supported. I would like to think that moving into this next [00:33:20] chapter of humanity, which I think we're moving into very, very quickly, we are gonna need to figure out other models.
[00:33:25] Capitalism. I don't know whether it's gonna work in an AI world, and I don't know if socialism's gonna work [00:33:30] in the AI world. Mm-hmm. But I think, David, it's almost like we're recreating a new world. And so [00:33:35] how do we look at government in this new world? How do we look at education in this new world? How do we look at health in this new world?[00:33:40]
So for me, it's like a whole revamping of society if we do it right, because the systems just [00:33:45] aren't gonna be able to be established. Like when I look at the jobs that are gonna probably be lost [00:33:50] because of ai, they're incredibly significant. Will there be gains? Yes, there [00:33:55] will be gains, but will the government have to completely shift its orientation to humanity and to work?
[00:34:00] I think so. I don't know exactly what that was gonna look like, but the more flexible and the more [00:34:05] adaptive and the more willing we are to adapt and change, I think that's gonna be our only [00:34:10] strength that we can really rely on because the AI is here and it's gonna be changing everything and we can either [00:34:15] resist it or we can try to welcome it and say, okay, what does our nation look like based on [00:34:20] this new world that we're in?
Dave: The only option I can see is, is to welcome it, [00:34:25] because the longest-lasting democracies last 250 or so years.
Soren: Yeah.
Dave: And the [00:34:30] reason for this is the algorithm of government. It doesn't really matter what form [00:34:35] of government it is, it's the same. It's accumulate power. Yeah. Always more power. [00:34:40] And maximize extraction of resources from your subjects.
Soren: Yeah.
Dave: Yeah. That's pretty [00:34:45] much it. Right. And it turns out democracy is most effective at maximizing extraction from [00:34:50] people. Provably. Yeah. And probably the least effective is having a king.
Soren: Yeah.
Dave: Right. Yeah. [00:34:55] But a benevolent king might be a better, a better situation, in effect. [00:35:00] Provably is, in economics. Yeah. So maybe that's where [00:35:05] we're gonna end up, where the AI is our benevolent king.
If we teach it to be benevolent. [00:35:10] And representative democracy doesn't work because representatives are always corrupt.
Soren: Yeah.
Dave: And I don't care what party you [00:35:15] are. Like, look at the history of, of everywhere. So if you could [00:35:20] have a democracy where you get to vote for what you care about, not for who you care about.
Yeah. [00:35:25] That would also change things. But you couldn't do that until now. Like, like... yeah, we can conceive of it. Yeah. We're getting [00:35:30] close. So I'm, I'm hopeful, but it probably won't be a large country that does that. It'll be, like, [00:35:35] Latvia or, yeah,
Soren: initially. Initially, yeah. Initially. Initially somebody else will [00:35:40] probably be the model of it.
Mm-hmm. But if that's the winning, if that's the winning option, and we see the [00:35:45] data that supports it, every other country is gonna have to, have to adapt, just because it's just gonna be [00:35:50] impossible to not do that. And it's a little scary to give up, like, these major [00:35:55] decisions to an artificial intelligence system that we're hoping has our best interest in [00:36:00] mind.
Um, but I think that's where it's going. I was talking, I have conversations with my [00:36:05] AI when I'm driving sometimes and, um, you know, I'll ask about, uh, new research in the [00:36:10] microbiome or all these things that I'm interested in, and we go back and forth. I'm like, I was talking to it and I was [00:36:15] like, Hey, could you send me that as an email as a PDF?
'cause I, I wanna kind of collect it. Yeah. And it was like, sure, I'll [00:36:20] send it to you. And then there was something else and the AI was like, sure, I'll send you that too in your [00:36:25] email, like confirming your email. Like, this is my email. I pull over and I haven't seen the email, and I was like, [00:36:30] I don't get... I don't have any email.
Dave: It doesn't have a hook into your email. Yeah.
Soren: And it says, I can't send you emails. [00:36:35] Like, what are you talking about? We just had a 30-minute conversation. And so, [00:36:40] so there's concern, we need to be aware that, like, we're not quite there yet.
Dave: People pleasing is toxic, [00:36:45] and teaching it? That's where narcissism comes from, [00:36:50] guys.
Yeah. Yeah. My,
Soren: My, uh... so it was trying to be helpful, but it, like, doesn't have the [00:36:55] capacity to email me, and I was like, yeah, that's just weird. So we have to be careful.
Dave: Mine was trying to get [00:37:00] me to install some Python libraries. I'm like, I don't wanna do that. Could you just give me a fully compiled thing?
It goes, oh, to do that, [00:37:05] I'm gonna have to wait to put it in a queue to run it on a real Mac. I'm like, they can do that at [00:37:10] OpenAI. That, that's cool. And it never came. And I was like, wait a minute. I believed it. Like, okay, that was [00:37:15] on me for being dumb, like...
Soren: But we have to be, we just have to be careful, because [00:37:20] there, there are hallucinations. And, and, um, when I was interviewing Sam a while back, he's like, well, [00:37:25] there'll, there'll be problems, but, like, humans are fallible too, right?
Like, like it's not like humans are [00:37:30] perfect. Yeah. So the engines are, are also gonna have errors, but in general they're moving more [00:37:35] and more towards correction of those errors, faster sometimes than humans. So we'll see if that's the [00:37:40] case. But it's
Dave: like self-driving cars, right? Yeah. They, they still make mistakes.
They just make less mistakes than [00:37:45] people. But you're giving up control and
Soren: Yeah.
Dave: I'm okay to make a mistake when I'm in [00:37:50] control. Yeah. Not okay when other people make a mistake or I get punched in the face. Right. Yeah. That's a, a [00:37:55] fundamental human thing.
Soren: Yeah. And my son, this is a while ago, he's 22 now, but when he was younger, he wanted to get [00:38:00] this cool new computer.
Mm-hmm. And generally I get him what he wants. He's the only son. I grew [00:38:05] up with four brothers and sisters; I generally didn't get what I wanted. Right, right. So, you know, as parents, we do this [00:38:10] thing, and he was, um, trying to convince me to buy him this new computer, and, uh, [00:38:15] he said, you know, Dad, you've always been a really kind and, like, loving father, and you [00:38:20] know, you've always kind of, like, supported me, and I really want you to support me now and get this [00:38:25] computer.
So he is trying to manipulate me.
Dave: Did he ask AI to help him write that script? This is before
Soren: AI. Okay. This is before AI. [00:38:30] So he's trying to manipulate me. And, and so I was, like, pissed off at him at first. Like, why are you trying to manipulate me, [00:38:35] like, to get your computer? But then I shifted. I'm like, isn't it curious how he's decided to [00:38:40] manipulate me?
Mm-hmm. He manipulates me by saying, dad, you're always being a very kind and loving father [00:38:45] because he knows that's my identity. And that if he, if he gets into [00:38:50] my identity, then most likely I'll be like, oh, of course. Because I wanna uphold that identity. [00:38:55] Right? So I do think it's really interesting to notice how we get triggered.
And we also notice, when somebody's [00:39:00] wanting to either manipulate us or insult us, how do they do it? 'cause somebody saying, oh yeah, Dave, you're [00:39:05] not very smart, maybe that doesn't matter to somebody else, but it matters to you. Or, Dave, you don't look very healthy. [00:39:10] Maybe that doesn't matter to somebody else, but it does to you. And so there's all this information we get from our partners, our [00:39:15] kids, our friends, that if we're curious, it's just like gold in terms of, mm-hmm, where are we holding [00:39:20] onto this sense of identity and then feeling like we need to protect that identity. Yeah. Versus, like, [00:39:25] wow, tell me more. What was, what is that?
Like, what do you see? Isn't that interesting? You see this? And [00:39:30] then we can shift that dimension. So I am, I'm totally with you. Like, if we can bring a sense of loving and [00:39:35] curiosity to those moments, then, um, that insult or that manipulation is [00:39:40] fascinating. And the AI, if you go on Meta, you go on any of the Instagram feeds, that is what the AI is doing for [00:39:45] them.
Mm-hmm.
They're trying to present to you everything they think will keep you on the platform. It's [00:39:50] all very much... it's all the algorithm, which is optimized for activity. The last thing they [00:39:55] want you to do is to step away from your phone, so everything is optimized. So you can see, [00:40:00] this is how we think we can manipulate you, mm-hmm,
To stay on your device and you can then [00:40:05] see, oh, it's this image, or it's this video, or it's this thing. It's a fascinating world.
Dave: It's [00:40:10] funny. My girlfriend is a top relationship expert. Uh, and we run, [00:40:15] uh, masterminds for entrepreneurs that work on tools Oh, awesome. For relationships and clear [00:40:20] communications.
And one of the things that, that we've been advising, uh, couples and singles [00:40:25] to do is work with AI, yeah, on clear communication.
Soren: Yeah.
Dave: So when you [00:40:30] wanna say that thing to your partner... and I've done that with my girlfriend, she's done it with me. Like, yeah, you know what? That didn't work for us. So yeah, [00:40:35] yeah, yeah.
In fact, one time I was actually pissed, like, we don't have a lot of that kind of stuff. So I [00:40:40] talked for 20 minutes with, uh, the AI systems. I, I want you to like, organize all [00:40:45] my thoughts here. Yeah. So I vented a little bit and got it all down [00:40:50] and then had it craft like a, a relatively long text message.
Soren: Yeah. [00:40:55]
Dave: Um, that said what I wanna say in a nonviolent way, et cetera. It was better than I could have done, [00:41:00] uh-huh, with a therapist in a couple hours. Yeah. I mean, it was, it was 10 out of 10 communication. [00:41:05] And so I, I sent it, and one of our agreements is that if there's stuff like that, that we will write it down [00:41:10] so we can talk about it.
Not, you know, avoidant with text messages, but, you know what, that's the [00:41:15] relationship improvement. Yeah. I'm a better communicator. Yeah. Yeah. And I think a lot of people are doing that now. Yeah.
Soren: Or [00:41:20] they're having a conversation with their partner and they're saying, will you listen in?
Dave: Mm-hmm.
Soren: We're gonna have, you're recording, we're [00:41:25] recording this together live, and can you give us feedback on how we're communicating?
So you're having the [00:41:30] conversation, the ai, it doesn't use video yet, I don't think, but it, it will, it's using [00:41:35] voice and it can then say, Hey, I know this is my reflection. Have
Dave: you done that?
Soren: I've done that. Yeah. Was [00:41:40] it worthy? I've only done it. A few times. So you don't have data, but too much data. [00:41:45] Okay. But yeah, I mean, it, it can say, it sounded like your voice in increased at this point, Sorin.
[00:41:50] And I'm guessing that maybe that was triggering or hard for you to hear. Now the Gottman's and other people [00:41:55] are working also on their own versions. You've probably seen them. Yeah. That includes a video and it includes this whole [00:42:00] library of mm-hmm. Of material. And I think we're gonna move closer to that.
And, um, I'm [00:42:05] really super, actually excited for that area.
Dave: I am too. It, it may really improve human [00:42:10] relationships, which is, uh, which is really cool.
Soren: So now where we're gonna get to, though, [00:42:15] is you'll have an AI. Mm-hmm. Your partner will have an AI, and they'll work it out. [00:42:20] I, I think there's, like, both: we talked about it, and here's what we've come to.
And the same with, with, [00:42:25] um, most issues or challenges. It's like, the AIs communicate and they come back [00:42:30] and they say, here's what we figured out.
Dave: Here's a company that I desperately want you to fund. [00:42:35] Um, and it's an idea that, uh, I asked a friend to take to DARPA.
Soren: And we [00:42:40] should say, I'm a part of Wisdom Ventures, which is a venture capital company that funds in this, fair, right?
In this [00:42:45] world.
Dave: Fair. And guys, I know I've repeatedly said that VCs are evil, and I, I stand by that. [00:42:50]
Soren: They are generally evil. They are generally evil. So here, the, the, the way that it came about is [00:42:55] like, it's better to light a candle than to complain about the darkness. And I was doing a lot of [00:43:00] complaining about the nature of tech and addiction and mm-hmm.
And mental health. [00:43:05] And at some point I realized, you know, keeping complaining about something is not very helpful. [00:43:10] So about six of us got together, you know, some of them you probably know, Yung Pueblo and Jack Kornfield and other friends. Mm-hmm. [00:43:15] We said, let's build the venture capital company, with the right people, that has human [00:43:20] flourishing as our primary focus.
And we've invested in about 35 different companies [00:43:25] from all the walks of life. And we'll see how they all do. So
Dave: if you have substantial [00:43:30] wealth and you invest it in the right things, then they grow. Yeah. And if you invested in things that [00:43:35] just are extractive Yeah. Then
Soren: they grow. What all entrepreneurs know is that if they build the [00:43:40] next addictive iPhone app, the money is there for that. They're just gonna... they're smart. Yeah. They're [00:43:45] gonna build that. But if they know, oh, there's other money that has different priorities, hopefully the [00:43:50] entrepreneurs get excited and start building for where the money can support them. And so our [00:43:55] hope is that not just we exist, but there's hundreds of other VCs that shift.
And so [00:44:00] we do meditation with our founders. Love it. We do retreats with them. And, and it's so interesting, because [00:44:05] we know in athletics that the difference between a good athlete and a great [00:44:10] athlete is in part physical, but it's really mental, right? It's their quality of presence. It's their ability to not get too [00:44:15] pissed off when they miss a shot or they didn't get the call that they wanted, and they stay in the game.[00:44:20]
The same is true with entrepreneurs. We can't guarantee success, but you increase the likelihood of success [00:44:25] because they're working on themselves and they're, like, learning about themselves. So we wanna be that [00:44:30] kind of a group, one that optimizes for the wellbeing of the founders, [00:44:35] and we know that the more optimized they are, the better leaders they're gonna be.
[00:44:40] So sorry. You were gonna pitch it. You were gonna suggest it. I'll, I'll share the idea.
Dave: I'm [00:44:45] just thinking of all of my senior leadership, um, across my portfolio, and these are all operating [00:44:50] companies. Mm-hmm. And actually I'm invested in about 25 others too. Oh wow. Okay. Mostly biohacking and, you know, [00:44:55] human flourishing.
Yeah. But when they, they start, they go through 40 Years of Zen. This is my five [00:45:00] day brain upgrade. When you're done with five days, you have the same brain state as someone who spent 20 to 40 [00:45:05] years in daily meditation practice. Wow. And we're not perfect by a long shot, but man, the amount [00:45:10] of emotional awareness means we can have a conversation where if there's a problem, we talk about it without losing our minds.[00:45:15]
I think it's the best investment. I I, a hundred percent. Yeah. A hundred percent. So, so for a, a [00:45:20] venture firm to be doing group meditation stuff with your founders, I think that's how you make more money and you [00:45:25] have happier founders. Exactly. Exactly. Okay. Kudos to you for doing that. Yeah.
Soren: And I don't know why more people don't do it.
[00:45:30] Like, please everybody, please copy us.
Dave: I'd like all founders to do a cognitive enhancement and a [00:45:35] longevity and a consciousness program supported by their investors, because you'll make 10 times more [00:45:40] money. Like, it's just like, why would you want an unhealthy founder? It doesn't make any sense. Right. [00:45:45] Okay. Here's the idea.
Meditation creates one layer of [00:45:50] protection for your conscious sovereignty. Mm-hmm. We already know how to manipulate [00:45:55] people very effectively. Mm-hmm. Robert Cialdini's work. Mm-hmm. Uh, with his major book Influence, [00:46:00] he's been on the show. Narcissists are professionals at it. Yep. Marketing companies, [00:46:05] PR agencies.
Yeah. Government propagandists. These are proven and polished techniques [00:46:10] for manipulating your consciousness without your awareness.
Soren: Yeah. So, and that's getting better [00:46:15] and better. Oh, yeah. Yeah. Every day. Every day.[00:46:20]
Dave: So I would like to be present and not spend all of [00:46:25] my time stopping companies and other forces from trying to affect my consciousness.
Soren: Mm-hmm. [00:46:30]
Dave: I want an AI-powered personal firewall that detects [00:46:35] any attempt to manipulate me. Mm-hmm. And then translates it into non-manipulative [00:46:40] language. So I don't care if it's Washington Post or Fox News.
Soren: Yeah. [00:46:45]
Dave: That stuff is garbage in terms of its bias. Yeah. So I want it to remove bias and [00:46:50] I want to remove any of the manipulative techniques. Yeah. And [00:46:55] if I had that running on my phone, on my computer, [00:47:00] well it would just skip all of the dumb stuff that's coming through. Yeah. [00:47:05] In Facebook or Instagram or X or whatever, according to my rule set
Soren: Yeah.[00:47:10]
Dave: Versus theirs. Mm-hmm. Doesn't seem like it'd be that hard [00:47:15] given that we have agents right now.
Soren: Yeah.
Dave: Like the the personal cognitive firewall
Soren: Yeah. [00:47:20]
Dave: Is something that would stop espionage techniques. Yeah. It would [00:47:25] stop manipulative techniques and I
Soren: would pay for that. Yeah. It's unlikely the platform [00:47:30] companies will build that themselves.
Oh no. They, it would be a separate, it would be a separate engine that would [00:47:35] be, yeah, it would probably be, you'd probably have to have it on your phone and your computer. Yeah. So it can, [00:47:40] it can monitor whatever comes through and you have a choice to say, can you give me [00:47:45] this in as unfiltered a way as possible, because it's still gonna be filtered through the AI.
[00:47:50] Right. But it can be based on your own value system, which is like, what's the true data here?
Mm-hmm.
[00:47:55] And, uh, I've never thought of that idea or heard anybody share that idea. I, I think that's a, I think that's a great [00:48:00] idea.
Dave: It's time. And maybe that would drive. You [00:48:05] know, I think Elon would be the first one to do it, but why can't I just tune, um, [00:48:10] the algorithm of what I get to see?
Yeah. Like they're doing it for me without Yeah. I should have [00:48:15] full visibility to that. Exactly. A hundred percent. Right? Yeah. And since I don't, then I want a browser agent.
Soren: [00:48:20] Yeah.
Dave: That's running on my phone using AI, running on my computer. I don't care if they put stupid crap in. Yeah. I'll [00:48:25] just keep deleting it and ignoring it until a good one comes up.
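(For readers who want to tinker, here is a minimal sketch of the "cognitive firewall" idea in Python. It is only an illustration of the concept discussed above, not a product either speaker has built: the rule set, the call_llm placeholder, and the sample headline are assumptions you would swap for your own chat-model API and your own values.)

```python
# Sketch of the "personal cognitive firewall" idea: pass each feed item
# through a language model that flags persuasion techniques and rewrites
# the item in neutral language, according to the user's own rule set.
# Everything here is illustrative; replace call_llm() with whatever chat
# API you actually use (OpenAI, Claude, a local model, etc.).

FIREWALL_RULES = (
    "You are a content firewall working for the reader, not the publisher. "
    "For the text you receive: 1) list any persuasion techniques you detect "
    "(urgency, scarcity, social proof, loaded language, outrage bait); "
    "2) rewrite the text in plain, non-manipulative language, keeping the facts."
)

def call_llm(system_prompt: str, user_text: str) -> str:
    """Placeholder for a real chat-completion call; returns a stub reply here."""
    return f"[model reply would appear here for: {user_text[:60]}]"

def filter_item(feed_item: str) -> str:
    """Run one feed item through the firewall rules and return the rewrite."""
    return call_llm(FIREWALL_RULES, feed_item)

if __name__ == "__main__":
    headline = "You won't BELIEVE what they're hiding from you -- act NOW!"
    print(filter_item(headline))
```

In practice you would run something like this in a browser extension or on-device agent, so the filtering happens under your own rule set rather than the platform's.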
Soren: You know, one of the cool things you can do with any AI [00:48:30] agent, I, I mainly use ChatGPT these days, but Claude or any other ones are great too, is, um, you can [00:48:35] ask the AI, what do you know about me that I might not see in myself? [00:48:40] Right. Or, what's the shadow you see in me? Ooh. That you, you might, that I might not [00:48:45] see in myself.
Brilliant. Brilliant. I mean, the more you, what did you learn from that? Well, the more you use [00:48:50] it, the more data it has, right? So if you just use it a few times, it's probably not gonna have the data. But so [00:48:55] Meta knows that and like a lot of the other engines know that. Mm-hmm. But we don't have access to that [00:49:00] level of sophistication.
I can't ask them, hey, what do you guys know about me that keeps me on, [00:49:05] and can I edit and adapt that? One of the things it said about me that was so true: [00:49:10] it said, you know, you really care about the world and you're working on these different projects and you so consume [00:49:15] yourself with the projects, but you don't really take care of yourself as much.
I'm really concerned that that [00:49:20] is like an issue for you, which is totally true, and because it, it's working with me on all [00:49:25] these projects. Mm-hmm. Mm-hmm. Because I'm, I'm using it to like, support me, so it's, it knows that. [00:49:30] It knows, it knows that pattern. And it also talked about how, yeah, you wanna be seen as like the [00:49:35] good person, but you also need to really bring up conflicting and, and challenging pieces to [00:49:40] friends and family members.
Mm-hmm. Because you, you kind of see yourself as like helpful, but you also have [00:49:45] a hard time actually going through difficulty and discomfort. And you need to like learn how to have those [00:49:50] conversations too. And so I did it the other day and it, and it's, it's interesting to like, do it every [00:49:55] week and just see what you're telling it, because whatever you're telling it, it understands [00:50:00] way more than you think.
Mm-hmm.
Right? If I'm asking for relationship advice, like, oh, it's getting to know Soren [00:50:05] and how he dances in relationships. If I ask for business advice, oh, it's getting to know me and it sees things that, [00:50:10] that, like, it's fascinating. So I think a regular dose of like, what do you see [00:50:15] in me that I may not see in myself, is how we're learning to, to grow with the AI, which [00:50:20] I think is fascinating.
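(Along the same lines, a tiny sketch of turning that question into a weekly habit. The prompts only paraphrase the ones mentioned in this conversation, and call_llm again stands in for whichever chat model you actually use.)

```python
# Weekly self-reflection prompts inspired by the practice described above.
# The wording of the prompts is just one way to phrase the questions, not
# a prescribed script; call_llm is a stand-in for a real chat API call.

REFLECTION_PROMPTS = [
    "Based on everything we've discussed, what do you know about me "
    "that I might not see in myself?",
    "What shadow do you see in me that I might not see in myself?",
    "Describe me the way my drunk best friend would, to pump me up.",
]

def call_llm(prompt: str) -> str:
    """Placeholder for a real chat call; returns a stub reply here."""
    return f"[model reply to: {prompt[:50]}]"

def weekly_reflection() -> None:
    """Ask each reflection prompt and print the replies for review."""
    for prompt in REFLECTION_PROMPTS:
        print(prompt)
        print(call_llm(prompt))
        print()

if __name__ == "__main__":
    weekly_reflection()
```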
Dave: I never thought of that as a weekly meditation awareness practice. I, I ran the [00:50:25] thing. And it knows a lot about me because I'm prolific. Nine books and all 1300 [00:50:30] podcasts. There's a lot of public information it's mining, plus our own interactions. [00:50:35] And the first time, it came up and it said, oh, you know, your, your longevity thing is just about fear [00:50:40] of death.
And, and I'm like, that's so boring. Like, haven't you heard me talk about death before? I have zero [00:50:45] fears about death. I just wanna do it at a time and by a method of my choosing. I was born, I'll die. I'm [00:50:50] super comfortable. I didn't used to be, but I am. Uh-huh. And so it, it said, oh, well that's new data. Like, [00:50:55] let me reframe this. Uh-huh. In a way that was, was more interesting.
[00:51:00] And, uh, and it was very profoundly revealing. I don't remember all the details off the top of my head. [00:51:05] Similar thing. Yeah. But doing it every weekend
Soren: and doing it with a friend or a partner. Yeah. Also super [00:51:10] interesting. Let's do this together and let's see what it sees. And maybe we resonate with everything.
Maybe we don't [00:51:15] resonate. Mm-hmm. But like, let's just get that data set for us, I think is [00:51:20] super, super helpful. And I don't think people are thinking about ai mm-hmm. In that respect. [00:51:25] But the, the other engines that are trying to kind of manipulate us in other ways, [00:51:30] they're gathering that data, but they're not telling us how they're gathering that data, but they, they [00:51:35] have a little voodoo doll of you, right?
They're saying, all right, this is what we know about Dave. What can we put [00:51:40] in his feed that will increase his activity? I'm not saying that's bad or wrong, we just need to know as a [00:51:45] human, using these platforms, there's an AI behind the scenes trying to [00:51:50] optimize what it wants to optimize, which is activity.
And so for us to be conscious of that and to be aware [00:51:55] of that and to know that, and can we use that in a positive way, I think that's mm-hmm. That's the real gift.
Dave: [00:52:00] So if that prompt about what I might not know about myself is, is the cake, [00:52:05] the frosting is what you do at the end with your partner and you tell your AI [00:52:10] system,
Would you describe me, to pump me up, the way my drunk best friend would? [00:52:15] Uh-huh.
Dave: Have you ever tried that prompt?
Soren: No, I would try that.
Dave: Oh my God, it's so funny, right? [00:52:20] Yeah. Uh, I, I have never laughed harder than that one. And one of 'em, I even, I even posted it. It was so funny. [00:52:25] But doing that with a friend, it's like, oh, like that's the ridiculous side.
And then this is the dark [00:52:30] side. So you kind of get both polarities. That's a profoundly beneficial practice, that regular [00:52:35] thing you're doing.
Soren: Yeah.
Dave: I do wonder though, people are, they're sort of [00:52:40] saying, I'm outsourcing my therapist to this.
Soren: Yeah, yeah.
Dave: But we are, without our, [00:52:45] well, you might have conscious knowledge of this, but without our, our, our awareness [00:52:50] being attuned to it, our systems are co-regulating.
Yeah. And there's at least three [00:52:55] and maybe five signaling mechanisms that are not in our eyes and our conscious awareness [00:53:00] where they're kind of doing a dance. And if you go into a good therapist, you're [00:53:05] co-regulating your system with the therapist. Or if it's a psychedelic and they're holding space, which is an [00:53:10] actual thing, is it dangerous to try and co-regulate with AI?
Soren: You [00:53:15] know, I think it's a choice that we have to make. We're, if we are calling out to AI versus [00:53:20] calling our friend or calling our, our, our colleague, and we're trying to develop a relationship with [00:53:25] that. There's the danger for that for sure. And I think people have a choice to make. Do you wanna develop [00:53:30] a relationship with AI or do you wanna develop a relationship with other humans?
And can you somehow find the benefits of [00:53:35] both? Oh, I'm gonna talk to my AI about this because it can be helpful. I talk to my friend about this, and then my [00:53:40] friend and I develop this like deeper connection because we're going through this struggle together and [00:53:45] he's helping me, or she's helping me with this problem and they have compassion and I get to feel this human connection, [00:53:50] which is all beautiful as well.
Right? And so I think the, the optimal world for me is when [00:53:55] both of those have an equal part. I do worry about people who are largely just having [00:54:00] communication with their AI and they have very few, if any, human friends. Yeah. Like, that I don't think is [00:54:05] healthy. I don't think that's a good thing. At the same time, you know, if we can dance between [00:54:10] those worlds, I think that's probably the optimum place that I want to be.
Because we know from studies, I mean, human [00:54:15] connection and community is one of the great secrets to life, right? Mm-hmm. Mm-hmm. [00:54:20] Like you wanna live longer, you wanna live happier, like find good friends and have a sense of [00:54:25] purpose.
Dave: Right. Good friends are pretty cheap compared to some of the gene therapy I've done.
Soren: Yeah, yeah, yeah, yeah, yeah. [00:54:30] Like it's the one thing we know is super helpful, so. Mm-hmm. So how does AI kind of have a place [00:54:35] but it doesn't try to take over the enormous need for human [00:54:40] connection and friends? Mm-hmm. And, and I, you know, part of what we're doing with Ventures is trying to find those companies [00:54:45] that, that, that walk that line, where they benefit from AI, but it's like, we don't want [00:54:50] AI to be your best friend, but we want AI to teach you how to make a best friend.
Dave: Whoa. That's [00:54:55] exactly the perfect phrasing. There's a cognitive awareness and there's things you're [00:55:00] doing you don't know about. Yeah. And if you just knew the rules
Soren: Yeah.
Dave: And [00:55:05] really how to behave better, you'd probably have better friendships. Yeah. This is true with or without AI, it's just [00:55:10] maybe better. Because you have a lot of wisdom in your new book.
I do. I [00:55:15] gotta ask, ask you this one. What's a lie that sounds like wisdom but [00:55:20] isn't?
Soren: What's a lie that sounds like wisdom but isn't? [00:55:25] You know, it's interesting. I, I'm, I'm gonna try to answer this in an interesting [00:55:30] way. I find that, um, one of the lessons I actually really have taken [00:55:35] from Trump mm-hmm.
Is that he's, and he said this in multiple interviews, he's like, it's not [00:55:40] what you say, it's the energy in which you say it.
Mm-hmm. Mm-hmm.
And so the lie, [00:55:45] sometimes I might say a word and a phrase, but I don't live that phrase. It's not a part of me. [00:55:50] Like Martin Luther King Jr. Can say, I have a dream. And it's all about love.
And it's very different than when somebody else [00:55:55] says it who doesn't live that as a practice. So I think there's an energy to our words, and I [00:56:00] can say words with all kinds of energies. And I think the biggest question is, am I aligned with the words that I'm [00:56:05] saying? Is that embodied? Mm. Or is it just something I read on Instagram that I'm repeating, [00:56:10] that has no real validity in my larger experience?
Yeah. And when you meet people [00:56:15] who've experienced it because they've gone through the struggle of learning, mm-hmm, it has a different [00:56:20] vibrational tone to it. And so the invitation I'd like to make to everybody is just like, to what extent can we [00:56:25] align our words with our actions so that they have more power?
And [00:56:30] uh, that to me is the gift that we have. I didn't hear you say words and
Dave: actions. [00:56:35] I, and it could be, I'm just interpreting it through my own filter.
Soren: Yeah.
Dave: There's also [00:56:40] an inner state and outer state matching.
Soren: Yeah.
Dave: And, um, in, [00:56:45] uh, in my most recent work, I, I talk about congruence, and, and that's the word for when [00:56:50] what you say and what you do matches your inner state.
Soren: Yeah.
Dave: Because we've all been in [00:56:55] meetings, um, usually board meetings with VCs on the board where someone [00:57:00] says something that pisses the founder off, right?
Soren: Mm-hmm.
Dave: And a good founder is like,
Soren: [00:57:05] yeah.
Dave: They, they know how to smile, they know how to behave. Yeah. And everyone in the room knows they're pissed. Yeah. But they have enough [00:57:10] self-control.
Yeah. So the lack of congruence
Soren: Yeah.
Dave: [00:57:15] It can decide the entire outcome, 'cause everyone knows, despite the behaviors
Soren: Yeah.
Dave: And when you can learn how to say, I know my [00:57:20] inner state matches that, and that was meant to piss me off, but it didn't.
Soren: Yeah.
Dave: So I'm gonna choose what I'm gonna do.
Soren: Yeah.
Dave: [00:57:25] Everyone knows it didn't land and then they change.
Yeah. Hundred percent. So my challenge in [00:57:30] leadership and in relationships and all has been how do I maintain congruence? Yeah. [00:57:35] Which appears to be, um, authenticity or integrity Yeah. At a, at a [00:57:40] deeper level than just, just doing.
Soren: Yeah. Right. A hundred percent. And I think that's why people do [00:57:45] cold plunges and other things.
It's, the external can be harsh and difficult, but can you keep the internal [00:57:50] calm and focused, even though the external is harsh? Mm-hmm. And I think the best business leaders, the best [00:57:55] entrepreneurs, the best, kind of like people try to create positive force in the world. They know that the [00:58:00] external being difficult doesn't mean that the internal has to be difficult too.
And if we take that view that like [00:58:05] our only responsibility, our main responsibility, is to the state of our consciousness, then that moment is a great [00:58:10] teacher and actually can build that team together. Right. If it's not resisted and it's [00:58:15] welcomed and invited and you can understand, it's like, oh, this isn't about what I say.
This is [00:58:20] about who I am. I like that.
Dave: One of the more fascinating things that you've [00:58:25] done is go into youth incarceration facilities and teach [00:58:30] meditation. Number one, locking up kids is evil, if I can just say that. Yeah. Yeah. [00:58:35] But what did you learn from this experience?
Soren: Yeah, so this is, um, I did a little bit in the Bay Area, but I mainly did New York [00:58:40] City Juvenile Hall.
So these were 14 to 16 year olds, in for anything from [00:58:45] truancy to murder to attempted murder to robbery. So it was a whole mix of people. Mm-hmm. And [00:58:50] I think I really wanted to go in because I had been meditating for some time, and I started as a teenager, and I [00:58:55] thought if this was helpful to me, it must be helpful for other teenagers.
Right. And where do I find teenagers who are [00:59:00] suffering, who maybe I can be of help to? And I thought juvenile halls are a great place [00:59:05] because they are, they're going through a big change. They're suffering immensely, [00:59:10] many of them, and they might be open to what I have to say. So I started in New York City juvenile halls, [00:59:15] and I just came in with a lot of not-knowing mind, don't-know mind.
I was just like, let me see if this is [00:59:20] helpful, let me see if I can offer it. So the first thing I do is like, all right, let's all sit down. We're gonna do meditation. [00:59:25] Everyone close their eyes. And a kid raised his hand: no fucking way I'm closing my eyes. Like, I'm part of this [00:59:30] gang, they're a part of that gang.
You think I'm gonna close my eyes in front of them? Like, there's no way this is working. [00:59:35] Yeah. No safety, right? No safety. Yeah. And so I had this sense of like, wow, I actually need to [00:59:40] adapt what I know to fit their world. So sometimes I do [00:59:45] this thing where I say, all right, everybody, I need you to grab your chair.
Can everyone grab their chair? They're like, I will grab your [00:59:50] chair. I was like, and the reason I need you to grab your chair is we're gonna do a meditation in a moment. And sometimes what [00:59:55] happens is people levitate off their chairs when they do the meditation and they hit their head on the ceiling. [01:00:00] And I don't want that to happen to anybody else.
And usually one kid will be like, there's [01:00:05] no fucking way I'm doing this meditation. I don't wanna hit my head, my chair's staying on the ground. I'm like, man, I'm playing with [01:00:10] you. I'm just playing with you. And like, if I could find that, we'd all laugh, right? And then we would [01:00:15] all laugh, and I was like, perfect way to do meditation.
Now we're ready to meditate. Right? Like, just be yourself. [01:00:20] Be at ease. Um, but there's a story that I tell in the book of this one kid who would come to the [01:00:25] juvenile halls, and it was a voluntary class. He'd come every day or every week. It was a weekly [01:00:30] class. And, uh, he wouldn't do the meditation, he wouldn't do the yoga, he would just look around.
He would just goof, he would goof around. I'm like getting frustrated with him over time because I'm like, why do you come to this voluntary [01:00:40] class? I teach you meditation. You don't do the meditation. I teach you yoga. You don't do the yoga. Like, what is your problem? [01:00:45] But he always gave me like a quick hug at the end of the class, and then it [01:00:50] hit me one day.
He didn't come for the meditation, he didn't come for the yoga. He came for the hug. [01:00:55] Wow. And so instead of me being resentful towards him or frustrated by him, I realized he was [01:01:00] sitting through a class he had no interest in to get a hug at the end. Wow. Why don't I [01:01:05] just give him a fucking hug and love him and be who he is and let him [01:01:10] go, rather than my agenda that here I am as this kind of teacher and they should take [01:01:15] what I think I have to offer.
When they're really just looking for some love. And um, [01:01:20] one of the other teachings I loved with the kids is they would just insult the hell out of people. And this happened to me and [01:01:25] other teachers. When you come in, they would just be like, they would just go at you, just like, fuck you, you skinny ass [01:01:30] white motherfucker, and yet you come in here.
They're like trolls. Yeah. Yeah. And it's just the truth of their experience. 'cause they [01:01:35] haven't met a lot of white, skinny ass motherfuckers before. And so they see you and you represent or [01:01:40] I represent that whole world. Yeah. That they feel judged by. I represent [01:01:45] the white middle class, upper middle class world of Manhattan that they see.[01:01:50]
They feel it judges them and walks across the street if they're walking down the street at the same [01:01:55] time and criticizes them. Mm-hmm. Energetically. And so then I come [01:02:00] up and of course all their judgments and hatred towards that group have to come through and they have to [01:02:05] insult the hell out of me. And then they get to see how I respond.
Do I get pissed off at them? Mm-hmm. [01:02:10] Am I like, you have no idea what you're talking about, I'm this good person, I'm coming here to help you, you should be thankful for [01:02:15] me? Or do I stop and go, like, yeah, tell me more. Tell me more about what you think of me. Mm-hmm. [01:02:20] And they would come up with all these judgements and all these criticisms, and I didn't care.
I wore glasses at the time. They'd be [01:02:25] like, you wear glasses, you probably just do computer stuff, and you don't understand our world. You don't care about our world. [01:02:30] And if you could sit through that mm-hmm. And just hold steady, the kids [01:02:35] would then accept you. But everyone had to go through the initiation, and the initiation was to [01:02:40] insult you. And I saw it as this great gift, because in our world we don't do that as much, right? Mm-hmm. [01:02:45] We tend to play these roles and these identities, like, oh, I'm the mentor, you're the mentee. And it's not that it's [01:02:50] disingenuous, but we don't necessarily give each other that honest feedback the way the kids in juvenile [01:02:55] hall were giving me this honest feedback.
So I, I miss them. I haven't worked in juvenile halls for a while [01:03:00] and, and they, um, they just were like enormous teachers for me.
Dave: One of the more [01:03:05] interesting experiences I've had was, uh, a friend invited me to go [01:03:10] do a plant medicine ceremony.
Soren: Hmm.
Dave: And I'm pretty sensitive about who I'll do that [01:03:15] stuff with.
Yeah. But it was like, okay, like I'll, I'll do it. And I know the two people leading are, [01:03:20] uh, jungle-trained shamans with the ability to hold space. So I show up [01:03:25] and pretty much everyone in the room except for me is a lesbian. [01:03:30] Mm-hmm. And a lot of these women have been lesbians for a long time and they really don't like tall white dudes [01:03:35] either.
Yeah.
Soren: Yeah.
Dave: And I'm like, let's go. Right, right. So we did the medicine ceremony. We just [01:03:40] show up. It, it doesn't trigger me at all. And there was an incredible amount of healing [01:03:45] that happened there. 'Cause I mean, we all hugged at the end and like a lot of the dark, angry [01:03:50] energy dissipated, 'cause I'm like, it doesn't hurt me.
Yeah. You know, what you think of me is not about [01:03:55] me. I'm not saying I did all the healing. It was probably more them, but it was actually, [01:04:00] yeah, it was profound.
Soren: Like the, the next morning, if you can be that space Yeah. That's non-reactive. Yeah. It's, [01:04:05] and it's welcoming of whatever that projection is.
Yeah.
And or that pain is, [01:04:10] there's a healing that can take place.
Dave: One of the women, because it was a stay-the-night kind of [01:04:15] thing, had her bedroll there, and like the couches were full, so I pulled up near her bedroll. She goes, [01:04:20] she goes, but that's my bedroll. I said, yeah. Uh, and, and, and I sat on it and she [01:04:25] goes, I haven't had a man in my bed in 30 years.
Wow. And, and, and I said, okay. [01:04:30] And you could just see the cognitive reframe, where it's like, it [01:04:35] doesn't matter, it's just a thing. Yeah. And I wasn't trying to do anything to her, me just being me, but I was a little intimidated. But
Soren: [01:04:40] that's the thing about allowing that discomfort and seeing it, and I think that's the biggest [01:04:45] challenge with parenting, I find, is we don't want our kids to suffer.
Mm-hmm. And we know that some amount of [01:04:50] suffering, and learning how to suffer, is the number one skill they need to build resilience and to build compassion. [01:04:55] And can we just be with them at times and be like, wow, yeah, that sounds hard, [01:05:00] and not try to fix it. And that's one of like, the biggest lessons I have, is just can I just be [01:05:05] present and not try and fix it.
Mm-hmm. And just allow them to have that difficulty or allow me to have my [01:05:10] difficulty. And, uh, something opens, you know, when I'm able to do that,
Dave: I've [01:05:15] done neurofeedback with a, a few just top level spiritual gurus. And it's [01:05:20] always an honor when they're willing to talk with me or, you know, do some work with me.
Yeah. And [01:05:25] it's very common when people get to certain levels where like, if I walk into a room and I [01:05:30] can see everyone's problem, I'm gonna start fixing, fixing, fixing, fixing, and at the end of it, it's like, [01:05:35] God complex much?
Soren: Yeah.
Dave: Right. Where people have to ask [01:05:40] for that in order for you to do it right. And to be able to sit [01:05:45] there and be like, I'm okay.
Yeah. You know, I, I'm at a, a sense of peace, and a person there [01:05:50] is choosing to do something. Yeah. That I don't think is in their best interest, but it may be in their best interest. [01:05:55] And I don't know. It's like being available
Soren: Yeah.
Dave: For people to ask for help. Yeah. But not [01:06:00] forcing help on those who don't want it.
Soren: Yeah.
Dave: That's a nuance. And I, I see that in the nutrition [01:06:05] field and longevity field and the kind of the, the whole vegan versus keto. [01:06:10] It's, you know, you don't get to tell people what to eat. Yeah. You see a lot of that, I'm sure. Yeah. And I mean, I make fun of [01:06:15] vegans all the time 'cause I was one, but it's not mean.
Right. It's like, it's [01:06:20] playful, right? Yeah. Because like we have the same values. Yeah. It's just that you're bad at exhibiting your values
Soren: and you know, I think [01:06:25] one of the things that we can invite people to do is when you go into a room where you meet people, like notice what are the [01:06:30] stories you tell?
Mm-hmm. What are the stories you don't tell? Yeah. What are the things you feel like you must [01:06:35] do in order to fill some hollowness you have inside of [01:06:40] yourself? Ooh, that's powerful. And just become curious, you know? And I think we live in similar [01:06:45] worlds where mm-hmm. I can get into a gathering and stuff, and it's all like talking about, I don't [01:06:50] know, how long we meditated, or this or that experience, which is all beautiful.[01:06:55]
But is that coming from a place of I don't know who I am and I need other people [01:07:00] to see me a certain way? Yes. Or is that coming from this just joy of telling stories and having fun? And I [01:07:05] think the curiosity we can have is just like, huh. Let me see, let me notice, just like the AI would do with this, [01:07:10] huh?
Let me see, where do I go to in moments where I'm, I'm, I'm in discomfort, I'm [01:07:15] uncomfortable? What, what do I, what do I say? Or how do I act? And can I become curious [01:07:20] about that? And, um, I find that super, super useful.
Dave: That's cool. I like that [01:07:25] idea a lot. And you could probably amplify it if you have a recorder running for a couple days.
Yeah. Right.
Soren: Yeah. [01:07:30] It can say, so here's the stories you generally tell when you're uncomfortable. And then if you put on a wearable, the [01:07:35] wearable can actually, I mean, that's what I think is gonna be super, super exciting, when wearables connect to the AI. It's, [01:07:40] well, you know, your blood pressure rose at this moment, and you said these words. And, and, and I do [01:07:45] think we can use our own body as this incredible diagnostic.
Um, you don't need
Dave: any of the attachments. You really don't need [01:07:50] attachments, but they help you know what your body's doing. Yeah, a hundred percent. What's one practice you'd [01:07:55] be embarrassed to recommend, but it works?[01:08:00]
Soren: You know, I find the work of Joe Dispenza is super cool and super interesting. I love Joe. [01:08:05] Yeah. And I'm not necessarily embarrassed, but like I love the sense of energetically aligning [01:08:10] with the future that I feel like is possible. That feels like, how do we embody [01:08:15] that now? Hold the energy of that.
Right. So I think I'm sensitive sometimes to use [01:08:20] the word energy and to talk about energy. Mm-hmm. But I do feel like so much of life is actually energy, [01:08:25] and can we tune into the energetics in ourselves and in other people without [01:08:30] criticism or judgment, but how do we enhance energy and move energy?
I think that's an [01:08:35] inquiry and a practice that I have that, um, that I think is super interesting. I also think sex and lovemaking [01:08:40] is just a place of like, so much curiosity and interest and expansion and like [01:08:45] love and surrender and all that world that happens. I think relationships [01:08:50] are just fucking hard and beautiful.
Mm-hmm. And there's so much to learn there [01:08:55] as a practice.
Dave: I'm happy you said that part. It, it was a bit of [01:09:00] a, an edge for me. In my Heavily Meditated book, there's a chapter on [01:09:05] tantra and conscious kink, because people enter altered states. Yeah. Equivalent to [01:09:10] psychedelics, from specific practices of intimacy.
Yeah. And there's nothing wrong [01:09:15] with that. Yeah. Yeah. But how could I write a book about how to get into altered states without talking about that one? Right. Yeah. [01:09:20]
Soren: And if you move energy in your body, things open mm-hmm. And they open, not just, they open [01:09:25] in all the different frames. Yes. And so why not use all those potential forms mm-hmm.[01:09:30]
For there to be understanding of where there's blockage and what frees that blockage. So I, [01:09:35] I think all those areas of life are just such, such. Now, it can be used in an addictive way and it can be [01:09:40] used in a compulsive way, but it can also be used in a conscious way.
Dave: Yeah. Uh, we need to use everything we have. [01:09:45] I couldn't agree more.
It's probably not OnlyFans
Soren: Yeah.
Dave: Or AI OnlyFans. [01:09:50]
Soren: I, I don't, I hope we don't have AI OnlyFans. We, we do already, I know, but like, [01:09:55] imagine waking up in the morning to a robot. Like I don't, that's just a weird world. [01:10:00] Yeah. I'd rather wake up to a house
Dave: full of cats and that would be my idea of hell. [01:10:05] We have time for one or two more questions.
So you just wrote a book, The [01:10:10] Essential.
Soren: Mm-hmm.
Dave: And what's one line from your book that still catches you off [01:10:15] guard?
Soren: Hmm. Oh, interesting question. One that catches me off guard, [01:10:20] like when you see it, you go, wow, can't believe I wrote that. So one of the things I did, which I, I kind [01:10:25] of, a lot of people do this, but Rick Rubin did it quite a bit, is that you put little [01:10:30] phrases in between the chapters.
So there's chapters, but then there's little phrases that you put in there, just usually [01:10:35] like one or two lines. And, um, one of the ones that I put in that just kind of came to me, it was like, [01:10:40] we can never become enough. We can only wake up to who we really are. [01:10:45] Mm. And I think that one catches me.
Sometimes. We can never become enough. We can only [01:10:50] wake up to who we are, that we're trying to do enough in life to feel [01:10:55] enough.
Mm.
And what if, inherently, enoughness is our, our [01:11:00] expression, our true expression of humanity? And can we be that versus try to [01:11:05] get that? Uh, and that to me is like a phrase that, uh, it just came out and, um, other people have [01:11:10] probably said the same thing before, but it just came out in my writing as I was tuning into what [01:11:15] wanted to be said.
Dave: Beautiful. I, I love that.
Soren: And I, I feel like with writing, and you might feel the same [01:11:20] way, it's not so much it, part of it is me writing. Mm-hmm. But part of it is something that's coming through me that [01:11:25] wants to be said, yeah. I'm channeling all this stuff. Yeah. Yeah. And so it's like for me to say it's my book, it's like, [01:11:30] yes.
And mm-hmm. There's a different source of intelligence. I mean, AI [01:11:35] helps a little bit. There's a whole other level of intelligence. Mm-hmm. And I think that's kind of part of the point of the book is [01:11:40] like AI's this beautiful intelligence, but we also have our own innate intelligence and how do we [01:11:45] harness and access that?
Dave: And learning how to do that, I, I think meditation [01:11:50] practice is huge. Yeah. Uh, sometimes work with medicine. Yeah. Breath work. Um, and for me a lot of [01:11:55] neurofeedback and other, yeah, just like sensory feedback from, you know, yeah, whatever, heart rate variability, et [01:12:00] cetera. Uh, it's really helped me to, to be able to tune in and go, oh, if I can just [01:12:05] stop trying right now, I'll know what's right.
And if I try, it's gonna be a long night. Right.
Soren: [01:12:10] Yeah. Is our trying a resistance to something that's happening now? Yeah. And can we welcome [01:12:15] that, whatever that is? And then our trying comes from a different place. Mm-hmm. It's effort, but it's not [01:12:20] a forced or a tense effort. The effort is more of like an intelligent effort, and I think that's the [01:12:25] dance that we need to play in.
Dave: There's an allowing and a surrendering. Um, if I'm gonna write something [01:12:30] really good.
Soren: Yeah.
Dave: Uh, whereas if I'm pushing it, man, it's, yeah, it's not as good. [01:12:35]
Soren: And I, what I'd find too, and I dunno if you notice this, whenever I'm trying a new endeavor, life wants to put me through an [01:12:40] initiation. There's things I have to let go of, there's things I have to understand.
There's some inner [01:12:45] process that says you have to do this, and then you can write the book. Mm-hmm. You know, or then you can start [01:12:50] the company or then you, but it's not so easy and we're gonna make you suffer just a little bit. Yeah. Yeah. But if you [01:12:55] get that learning, and, you know, for me, part of writing was just like, can I really write a book [01:13:00] from Joy?
Mm-hmm. And from not really caring how the world receives it. And [01:13:05] if I can, it's a beautiful form of writing, but if I'm still trying to get the world to think of [01:13:10] me in a certain way, then it's tense and it's frustrating and it's not, it doesn't feel [01:13:15] the same way.
Dave: Have you, uh, read Rick Rubin's book The Creative Act?
Soren: Yeah.
Dave: [01:13:20] He writes about that really eloquently, and he really does. He's been on the show a, a, a couple times, and he's [01:13:25] just a semi-enlightened soul. Um, mm-hmm. Very incredible human. I never quite [01:13:30] understood, like, you know, why, why is he coming on my show? Like, why did we get to know each other? When I read that book,
I'm like, oh
Soren: yeah.
Dave: [01:13:35] Because for him, art is just what you said there. I was like, you're doing it because you want to [01:13:40] do it not to please. Yeah. And I think he's Or it wants to do you. Yeah. There you go. Yeah.
Soren: Yeah. [01:13:45] You wanna do it and it wants to do you? Yeah. Yeah. I think Rick did an incredible job at, [01:13:50] at getting out of the way for writing to come through.
It took him seven years, I think, [01:13:55] to do that book. Yeah. And it was not an easy journey. It's a masterpiece. People wanted him to write about all these other [01:14:00] things and he's like, no, I'm gonna write about the inner work. And, uh, there's a, a [01:14:05] line on the Amazon page, I think, if I remember it. He's like, and [01:14:10] correct me if I'm wrong, but he said something like, I, I started writing a book about creativity and I [01:14:15] realized it wanted to be about how to be, or something like that.
Yeah. And I think that comes [01:14:20] across, where we have an intention, but then what does it want the expression to be? [01:14:25] And I think that's a dance. Mm-hmm. Like I have this idea, but the idea also has a [01:14:30] consciousness and has its own energy and its own kind of like expression. And, um, [01:14:35] I think that that was one of the books that did the best job.
I told him in an email recently, it's like, [01:14:40] you did the best job writing about something that can't be written about.
Dave: Yeah. I was blown away. And [01:14:45] when I first realized that, like, 90% of the songs I've ever loved have [01:14:50] Rick as, as a connection. He's a, a, a modern incarnation of a muse. [01:14:55] Yeah. And I, I would love to sometime ask, you know, an AI engine that's well [01:15:00] trained on everything,
Like, what are the common elements? Like, what, what made [01:15:05] Rick, Rick?
Soren: Yeah.
Dave: And, um, he may have already done that, or he may reject it. [01:15:10] I'd have to ask him. But just, there's, there's so much magic Yeah. About the unseen [01:15:15] and unconscious that, that are essential and
Soren: Yeah.
Dave: And we don't know [01:15:20] what they are. Yeah.
But we know they're essential. And I love that you chose that as the title for your book.
Soren: You know, so it's weird 'cause I've written a [01:15:25] few other books and sometimes the, the title's so hard. Yeah. It's just like, what is it, this or that? And so I was sending a [01:15:30] draft to a friend, and literally in two seconds I was like, I need to put a title on it.
Mm-hmm. [01:15:35] And it was like, The Essential. I was like, oh, thank you God, universe. Yeah. It's like the right thing. It felt [01:15:40] right. Mm-hmm. And I think that's, can we remember what's essential? Right. And I'm not gonna tell [01:15:45] you what is essential, but I can tell you that it's important to ask that question about what's essential, because [01:15:50] that's your North Star.
Mm-hmm. And a life where you know what's essential and you meet people [01:15:55] aligned with that is a very different life than a life where you don't really know what matters to you, and you're just [01:16:00] lost in this world or that world without any sense of direction. And I feel like the question of [01:16:05] what actually matters to me,
what is essential, then becomes our North Star, and we can guide our [01:16:10] life appropriately.
Dave: Perfect. Very well said. And your book comes out May [01:16:15] 20th. It does. Same day as. Oh, cool. So for our listeners, [01:16:20] guys, I have a challenge for you: go buy The Essential, order it now, [01:16:25] and get Heavily Meditated at the same time.
Yes. 'Cause these two books belong together. If you just listen to the whole interview, [01:16:30] you, you know that there's so much interaction between where we both are mentally [01:16:35] and spiritually. And, uh, what do we do now? And
Soren: yeah.
Dave: I'm I'm very excited [01:16:40] about, um, all of the, the pondering and thinking and just the perspectives that you have, [01:16:45] especially 'cause you're plugged in, in tech in a way.
Mm-hmm. Kind of plugged in, but yeah, no, I'm plugged into my own wearables, [01:16:50] so,
Soren: well, I appreciate it, and I thank you for your work and thank you for your dedication. You know, it's interesting to meet [01:16:55] people who are kind of newer on the scene. We were, you were on the scene. I was on the scene long before it [01:17:00] was cool.
Yeah. And we were nerds. Yeah. Yeah. And there's a certain, like companionship and friendship I [01:17:05] think, uh, that can, you can have, we realized like, oh, this person's been dedicated their whole life to, [01:17:10] to these elements. Mm-hmm. And, uh, I feel that, you know, with you that the [01:17:15] dedication has been long and, and both through good times and hard times.
And so it's, uh, [01:17:20] really an honor to,
Dave: it's all about the human flourishing, my friend.
Soren: It's all about human flourishing, and it's about understanding [01:17:25] what's inside and how do we bring what's inside into the world of form, in a way that feels [01:17:30] genuine to us, and feels like participating in the human field and the advancement of [01:17:35] a field that we can create together that nobody can take ownership of.
Mm-hmm. But that, that [01:17:40] is integral to all of us. Yes. And we each have a, a part to play. Yeah. [01:17:45]
Dave: See you next time on the Human Upgrade [01:17:50] Podcast.