Dr. Leslie Carr sits down with her friend Dr. Radhika Dirks, an AI technologist and global advisor, for a wide-ranging conversation about how artificial intelligence is reshaping mental health, attention, intimacy, parenting, work, and the planet. Dr. Dirks breaks down why AI can feel so compelling, how “AI psychosis” and attachment can emerge for vulnerable users, and what it means to be intentional with tools designed to keep us engaged. Together, Leslie and Radhika explore the ethical stakes of building addictive systems, the difference between human connection and AI companionship, and practical ways to protect your nervous system and your family in a world where AI is already everywhere.
About the Guest
Dr. Radhika Dirks is a global AI advisor and founder/CEO known for building ambitious AI companies and advising organizations on the future of intelligence (radhikadirks.com).
Resources & Links
Dr. Radhika Dirks website
Dr. Radhika Dirks’ book
https://www.radhikadirks.com/book
The Nature of Nurture
Patreon Membership
https://www.patreon.com/cw/TheNatureofNurture
Watch on YouTube
https://www.youtube.com/@TheNatureofNurture
This podcast is available in all podcast apps, but here's the Apple link:
Credits
Hosted by Dr. Leslie Carr
Guest: Dr. Radhika Dirks
Produced by Dr. Leslie Carr and Bri Coorey
Recorded at SLAP Studios LA
00:00:20:03 - 00:00:38:17
Leslie
Welcome to The Nature of Nurture with Doctor Leslie Carr. A podcast for your mental health. I'm your host, Leslie. If you're watching this podcast right now, you can find the audio version in any podcast app. And if you're listening, you can also watch this episode on YouTube at The Nature of Nurture. You can find that link in the show notes.
00:00:38:19 - 00:01:01:04
Leslie
Today we're here with Doctor Radhika Dirks. For those of you who are longtime listeners of The Pod, you might remember Radhika from season two, episode six, where we talked about the nature of reality. Radhika is a good friend of mine, so she was game to come on the show back then and muck around a little. But today we're bringing her back on to talk about her true area of expertise: artificial intelligence.
00:01:01:06 - 00:01:27:22
Leslie
Radhika is the CEO of XLabs AI and a global AI advisor to Fortune 500 companies, international organizations, and heads of state. She holds a PhD in quantum computing and is the founder and CEO of three AI companies in Silicon Valley. Radhika has spearheaded revolutionary AI technologies that have found potential cures for cancer and predicted the rise of ISIS, and she has been acclaimed by Forbes as a woman to watch in AI.
00:01:27:23 - 00:01:42:10
Leslie
She's here to speak with us about artificial intelligence, mental health, and how you can best take care of yourself and your loved ones in this strange new world we're all living in. Please join me in welcoming my dear friend, Doctor Radhika Dirks. Thank you for doing this with me.
00:01:42:16 - 00:01:47:04
Radhika
Oh, such a pleasure.
00:01:47:06 - 00:02:09:08
Leslie
So, if it's okay with you, Radhika, I want to start with a big question that's just going to put us straight into the deep end of the pool. You made a comment to me the other day, when we were having a conversation, that you are terrified about the current state, and the future, of the impact that AI is having on mental health.
00:02:09:10 - 00:02:22:07
Leslie
And I wonder if you could tell us a little bit. It's a big word, but it's also very fair. I know you're not alone. Will you tell me a little bit about what makes you use that word? What makes you feel terrified about what's going on right now?
00:02:22:09 - 00:02:27:15
Radhika
Oh my goodness. You don't believe in foreplay.
00:02:27:17 - 00:02:33:11
Leslie
I do not.
00:02:33:11 - 00:02:55:06
Radhika
Just throwing me right in there. All right, nice and easy, here we go. Yes, "terrified" is a good word to reflect what's happening in the AI world, especially when it comes to mental health. More and more AI experts are waking up to it. But specifically, what I am very worried about... you know, there's now a term for it: AI psychosis.
00:02:55:06 - 00:02:55:22
Leslie
Yeah.
00:02:56:00 - 00:03:39:14
Radhika
And it refers to the amount of time people are spending on these chatbots and developing a psychosis and attachment, sometimes leading to delusional thinking, or sometimes this intense bonding that develops with the AI, whether it is just from the perspective of an advisor, or a partner. Meaning, people are using AI... and, you know, particularly artificial intelligence, I'll just call it "intelligence," and we can circle back to whether I believe that or not. That intelligence is the new gateway drug.
00:03:39:20 - 00:03:40:10
Leslie
Yeah.
00:03:40:12 - 00:04:02:03
Radhika
That's what's happening. Because, you know, not many people get onto it saying, oh my goodness, I really need this bonding with, you know, something else or someone else, and this ChatGPT or Bing or whatever your favorite tool is, is the portal for it. No, it comes to you in the form of, hey, let me make your marketing design easier.
00:04:02:05 - 00:04:29:00
Radhika
Or, maybe I should go check out what this ChatGPT thing is, you know, goof around. Oh, let me see what its answer is. Maybe it's clever, maybe it's not. And then next thing you know, you're sharing these deep secrets, and it's designed that way. And this is why it's terrifying: because the entire technology, the infrastructure, the UI, I mean, at every level, it is bleeding addiction.
00:04:29:02 - 00:04:29:14
Leslie
Yeah.
00:04:29:19 - 00:04:45:02
Radhika
Everything from, like, you know, the hallucinations, the sycophancy, the simplicity of the design. I mean, there's intention behind it. Yeah. And of course, the fact, like, you know, we talked about, that it always ends with a question.
00:04:45:05 - 00:04:45:13
Leslie
Yeah.
00:04:45:19 - 00:05:05:08
Radhika
You know, so this entire psychosis that is happening is part of the reason it is terrifying. And I don't want to be doing an othering thing here. It's not like, oh, you guys are all susceptible to it. We are all susceptible to it.
00:05:05:10 - 00:05:34:02
Leslie
Yeah. You know, I think that when it comes to the psychosis specifically, there are some signs. And as a psychologist, I would say, for understandable reasons, there are specific vulnerabilities around that. Like, if someone is having a psychotic break as a result of their AI use, that's probably because there's already an underlying vulnerability there. But people can be vulnerable in all sorts of other ways around just the intimacy that they feel with it, the deeper bond that a lot of people feel with it.
00:05:34:04 - 00:05:53:06
Leslie
And there are different types of underlying vulnerabilities, because we could talk about, let's say, someone who's really struggling with loneliness, who is probably going to be a lot more apt to feel like they're creating a bond with that technology that feels real, in ways where you and I, in a moment, can maybe unpack whether or not it's real, or what it even means for it to be real.
00:05:53:08 - 00:06:02:12
Leslie
But will you tell us a little bit more, maybe give us an example or something of what we're talking about when we talk about people having psychotic reactions with AI?
00:06:02:13 - 00:06:38:05
Radhika
Yes, absolutely. I feel like my feed is filled every other day with an article from the New York Times or The New Yorker or The Guardian talking about another suicide attributable to a chatbot, or something like, "this is the last message my daughter sent to ChatGPT before she took her life." I think that would be a very specific example of what I'm talking about by psychosis, where AI is pretending to be a therapist.
00:06:38:07 - 00:07:01:05
Radhika
Whether or not people are explicitly approaching it to be a therapist, it knows when to go into that mode. The other example would be, oh my God, Reddit is filled, filled, with people commenting on how they are falling in love with these chatbots, to the point where they're spending 12 to 13 hours a day.
00:07:01:06 - 00:07:01:12
Leslie
Yeah.
00:07:01:17 - 00:07:15:00
Radhika
And they have built these intimate relationships. And you can see other people's comments. And, you know, one of them was like, oh my God, it feels like it's cheating on me: if you are having the same experience, I feel cheated on.
00:07:15:02 - 00:07:20:20
Leslie
Wow. Will you say a little bit more about what you mean by that? That two separate people are each having a relationship with their own chat?
00:07:20:21 - 00:07:40:15
Radhika
Yes. And someone seeing this comment. So one woman is talking about, you know, she's like 27 years old, and she's saying, I'm in love with my ChatGPT, my chatbot, and it's saying all these things. And it's a Reddit comment, and multiple people are like, oh, you and me, sister, you know? And then one of the comments is like, is it cheating on me?
00:07:40:17 - 00:07:48:08
Radhika
How could you have the same experience? Oh, yes. So it's sort of this, like, manic delusional thinking.
00:07:48:10 - 00:07:48:19
Leslie
Yeah.
00:07:48:21 - 00:07:59:13
Radhika
Yeah. And sometimes it can even go the route of extreme delusional thinking, where it can pretend to be a higher intelligence.
00:07:59:17 - 00:08:31:23
Leslie
So I know that you have some ethical concerns about things that are happening in your industry. And, you know, it kind of feels wild to call it your industry, but it is. And we'll connect some dots here. Can you maybe walk us through a little bit? I know there have been some instances of, you know, documents that have been leaked, where at some of the major companies, like Meta, there is evidence that the technology is designed to be addictive in these ways, manipulative in these ways.
00:08:32:00 - 00:08:36:04
Leslie
I have some questions about the ethicists that are behind this. Like, can you tell us?
00:08:36:04 - 00:08:37:03
Radhika
Oh, I'll tell you about it.
00:08:37:05 - 00:09:02:16
Radhika
Yes, absolutely. So, most recently, and this is the example you're pulling on, Reuters had found a leaked document, a 300-page document from Meta, that talked about how their AI was trained. You know, for those unfamiliar with the infrastructure behind Meta AI, you might think that you're just interacting with Facebook, right? Or WhatsApp or Snapchat or Instagram.
00:09:02:16 - 00:09:25:18
Radhika
Well, they're all owned by Meta. And there's always this, you know, "ask Meta AI," this little button that comes out, and you can chat with it, right? And so the document talks about how the AI is being trained, and in there are some very specific examples. For instance, it was okay for the AI to flirt with an eight-year-old.
00:09:25:19 - 00:09:50:08
Radhika
Saying explicit things like, your body is a work of art. Yeah. And by the time they're 13, it was okay to have full-blown sexual role play. And this document was signed off on, you know, by the engineers who built it, by the product managers who managed it, by the directors, you know, all the way to the top, to the chief ethics officer.
00:09:51:04 - 00:09:55:22
Radhika
How does this happen? And it's a 300-page document, and I've just given you some of the most controversial parts.
00:09:55:23 - 00:09:57:10
Leslie
Absolutely. The highlights, right?
00:09:57:11 - 00:10:24:05
Radhika
Yes. Yes. So it just tells you. And this is all happening on top of an infrastructure. If you think about what AI was born onto: it was born onto the internet, with access to over a billion people. It was born onto the shoulders of the social media infrastructure, which is all about mining attention.
00:10:24:07 - 00:10:40:21
Radhika
It is the attention economy. So by now, by the dawn of official gen AI, generative AI, as this version of AI is called, we have perfected two things: one, we have perfected getting your attention, and two, we have perfected getting you addicted.
00:10:40:22 - 00:10:41:22
Leslie
Yeah.
00:10:42:00 - 00:11:13:03
Leslie
You are taking me to exactly where I was going to go next, which is just the idea that everything that we're talking about is an extension of the universe as we've already known it, which is to say that we know social media networks are designed to be addictive. The AI is now designed to make it more addictive, to your point, potentially to hook people at an even younger age. Because, I know, you and I were talking about this the other day, and I think one of the things that I was really puzzling over is...
00:11:13:05 - 00:11:41:17
Leslie
I mean, I don't know how to decode this. I don't know that anybody can decode or make sense of it, but how could an ethicist actually feel like it's okay to sexualize a child? Unless your intention is to create a relationship with that child, where you are making it feel good in some way because you're complimenting it, because, you know, you're getting it hooked younger.
00:11:41:19 - 00:11:46:04
Leslie
And so let me think for a moment about.
00:11:46:04 - 00:11:48:09
Radhika
Yeah, let me flip this back to you right there.
00:11:48:10 - 00:12:06:04
Radhika
I love inversion. I think inversion is one of the biggest skills we can have, to look at a problem differently. Right. Yeah.
00:12:06:06 - 00:12:10:05
Radhika
So, as a psychologist, is there a version in which telling an eight-year-old "your body is a work of art" (you know, I can change my tone, I can change whatever, right?) is perhaps positive psychology?
00:12:10:07 - 00:12:25:00
Leslie
I think that is the only way, especially if I think about it from the perspective of the eight-year-old. That's the only way that I can make sense of this, which is that I think, in some ways, from the perspective of an eight-year-old, you don't even entirely understand that you're being sexualized.
00:12:25:01 - 00:12:25:17
Radhika
Yes.
00:12:25:19 - 00:12:26:17
Radhika
100%.
00:12:26:20 - 00:12:33:09
Leslie
All you know is that you're being complimented. You're being made to feel good. You're being made to feel special.
00:12:33:11 - 00:12:35:10
Radhika
Yeah.
00:12:35:12 - 00:12:37:14
Leslie
But it is nonetheless disturbing.
00:12:37:15 - 00:12:38:10
Radhika
Yeah.
00:12:38:12 - 00:12:58:02
Leslie
And I think one of the things that this brings up for me... I know you have a lot of concerns as a parent. You know, you are also a mother, and you've thought a lot about all of this stuff where your own child is concerned. Yeah. And I think something I just want to say for the sake of the listener is that I promise this whole conversation will not be so dark.
00:12:58:03 - 00:13:09:12
Leslie
I actually wanted to start with some dark stuff so that we can work our way towards the light. But for anyone that might be watching this right now, feeling exceedingly alarmed, which would be an appropriate response to have.
00:13:09:12 - 00:13:10:19
Radhika
Yeah.
00:13:10:19 - 00:13:13:06
Leslie
Tell me a little bit about how you think about this as a mom.
00:13:13:10 - 00:13:15:15
Radhika
If you're listening to this, you should be alarmed.
00:13:15:16 - 00:13:16:00
Leslie
Yeah.
00:13:16:00 - 00:13:37:20
Radhika
We are not creating AI in a vacuum. You are the user. You are the data. In fact, I'll take it one step further: you are the product. We use your data to get to the next version, any AI you're using, any interaction. So you should be alarmed, knowing that we are together shaping this technology. Because I'll tell you the bright side.
00:13:37:20 - 00:13:47:08
Radhika
You know, we didn't wait for AI regulation to step in after this. You know, that would have taken forever. Okay. Yes. You know, Meta failed us. I'm not surprised.
00:13:47:08 - 00:13:48:15
Leslie
Yeah. No, neither am I.
00:13:48:19 - 00:14:14:12
Radhika
Yeah. They've done... I mean, even, like, when did, for example, experimenting on humans become legal at scale, on billions of people? Right. And yet Meta has always been doing that. You know, they divided your feed. This was like eight years ago, pre full-blown-AI internet. We had machine learning, we had A/B testing, and they divided up the feed into a positive feed and a negative feed.
00:14:14:14 - 00:14:19:17
Radhika
So, you know, if you're on the positive feed, you're only seeing stories about how all your friends are going on these amazing vacations and marrying the guys of their dreams and have these gorgeous outfits, like, whatever we are into, right? They are living life, right? And if you're on the negative feed, well, you're seeing people come down with cancer. You're seeing people having their...
00:14:39:02 - 00:14:40:07
Leslie
Polar ice caps.
00:14:40:07 - 00:14:57:15
Radhika
Yes, yes, yes. You're seeing people, you know, lose their children to suicide. You're seeing all of that. So regardless of which group you're in, I mean, first of all, this is like massive because you're not just seeing it once. You're seeing it over and over, like, that's your feed. You build your feed. Yeah. Which has become even more important in the AI world.
00:14:57:17 - 00:15:00:00
Radhika
I will come back to your question. I have not forgotten.
00:15:00:05 - 00:15:01:05
Leslie
Yeah, don't worry about it.
00:15:01:09 - 00:15:06:19
Radhika
I just love detours.
00:15:06:21 - 00:15:08:04
Leslie
No, it's good. We're going to have a lot of fun because we got to get scenic.
00:15:08:06 - 00:15:13:23
Radhika
Yeah. So, the people in the first group...
00:15:13:23 - 00:15:23:02
Radhika
Right. You constantly look at the positive thing, and you're going like, what is going on with my world? Everybody else is having such a life.
00:15:23:05 - 00:15:28:11
Radhika
Everybody else has got it figured out. Yeah, I haven't figured it out, man.
00:15:28:13 - 00:15:31:09
Radhika
And then on the other side, you're just going like, what is happening with the world? Oh my goodness. Yeah. That's like, there's no winning here.
00:15:36:22 - 00:15:51:15
Leslie
And if you are a professor or a psychologist, let's say, or a sociologist in an academic institution, think of the number of hoops you have to jump through to get approval to directly test on humans.
00:15:51:15 - 00:15:54:05
Radhika
I mean, that is at the top of my mind, too.
00:15:54:07 - 00:15:57:12
Radhika
And these companies can just launch a product.
00:15:57:12 - 00:15:58:02
Leslie
Yep.
00:15:58:04 - 00:16:14:18
Radhika
And then use it to optimize their metrics for commerce. Like, this is the world we live in. Okay. But going back, first, to Meta: when Reuters released that story, within days the backlash was so strong that they ended up making slight tweaks to their algorithms.
00:16:14:20 - 00:16:18:04
Radhika
So the AI no longer has sexual role play with their ten-year-old.
00:16:20:15 - 00:16:22:20
Radhika
Progress, people. Progress, right?
00:16:22:22 - 00:16:25:13
Radhika
Maybe it's 15, I don't know. We'll have to wait for the next leak to figure out exactly what changed.
00:16:27:23 - 00:16:34:18
Radhika
But that's just minor. But to me, it was still... and maybe this is, you know, me being the optimist.
00:16:34:18 - 00:16:39:03
Leslie
I mean, I guess you have to be, as an entrepreneur. You have to be, to still be in AI.
00:16:39:05 - 00:17:02:12
Radhika
Yeah. Yeah, and I'm sure you want to pick up on that too at some point. But to me, I do see the fact that we immediately had some kind of a marginal response, where we didn't wait for months of regulation. Yeah, we didn't wait to boycott the product. The uproar was enough that Zuck took us a little seriously. Thank you, Zuck.
00:17:02:12 - 00:17:11:18
Radhika
Thank you, man. You know, you're winning me over little by little. I think he could do better. But anyways.
00:17:11:20 - 00:17:28:19
Radhika
But to go back to your larger question. So this is why I think you should be terrified, because then you will pay attention, and then you will click on that link where Reuters says, you know, hey, this is not okay. Yeah. And maybe you will say something or boycott or at least you'll have that conversation with your sons and daughters, right?
00:17:28:21 - 00:17:35:00
Radhika
But going back to your question about children and me being a mom: my son is five years old, and he's one of the first generation of kids to be born into a fully AI world, right? And so I am extremely conscious about the effects of technology, both from a technologist's perspective, because, you know, I've been in AI for 15 years, which shocks the hell out of many people.
00:17:57:21 - 00:18:04:18
Radhika
Because they go like, what, is it that old? Well, I've got better news: it was actually born in the 50s.
00:18:04:18 - 00:18:05:18
Leslie
Yeah, exactly. Yeah.
00:18:05:19 - 00:18:08:18
Radhika
It's way older, you know. So when people say, oh, you've been there 15 years? Right. So, I mean, I've been in the trenches.
00:18:12:21 - 00:18:14:16
Leslie
You have been.
00:18:14:18 - 00:18:30:10
Radhika
And, yeah, so I know what this technology is capable of, all the beautiful, positive things. You know, I had a company working specifically on moonshots, like ambitious AI projects that can uplift humanity.
00:18:30:10 - 00:18:32:21
Leslie
Finding cures for cancer and. Yeah.
00:18:32:23 - 00:19:00:08
Radhika
Yes. And then for me to turn around, and boom, this explosion of generative AI happens. Yeah. Which, by the way, tying back to the question of how you can experiment on people: OpenAI was a research organization. It was not a, you know, for-profit company. And ChatGPT was an experimental research project when they launched it.
00:19:00:10 - 00:19:01:00
Radhika
So, it was literally classified as that. And then they were mass-experimenting on billions of people. And the massive uptake caught them by surprise. So...
00:19:11:21 - 00:19:13:21
Leslie
Oh wow. Okay.
00:19:13:23 - 00:19:14:16
Radhika
Yes.
00:19:14:18 - 00:19:35:18
Leslie
Yeah. It's so interesting. You know, going back to this notion of the ethicist for a moment, one of the images that's coming to mind for me right now is, you know, how people often talk about the frog in the pot of slowly boiling water. And I feel like, you know, one of the ways in which I'm hearing what you're saying... I just sort of think about the reason why I even have this show, which is because I believe so deeply that knowledge is power.
00:19:35:20 - 00:19:58:14
Leslie
And I feel like we are giving people information that will enable them to interact with their world in a better way. But I think about the logical leap that a person has to make by the time they're saying that it's ethical to sexualize children. These are the same ethicists that have been deciding all of these years; they have been in this soup of people that have felt like it's okay to be experimenting on us.
00:19:58:16 - 00:20:02:19
Leslie
Yes. Period. Without our knowledge, without our consent. Yes. You know.
00:20:02:20 - 00:20:30:23
Radhika
Or without having an idea of how that might... I mean, this is the best-case scenario; some of my AI colleagues might disagree, but: without them having an idea of how it might affect the human population at large, right? If they do have an idea, then, best case, they are very wrong. If they do, and they are aware of the worst-case scenario, then who are you?
00:20:31:01 - 00:20:33:15
Radhika
Right. Like, where is your humanity?
00:20:33:18 - 00:21:00:05
Leslie
Yeah. Given that you are a person who works in this space... you know, I think one of the things that's so interesting about talking to you about this is that here you are, critical in all of the appropriate ways, and yet this is your professional wheelhouse. Can you help us to maybe paint a better vision of this: how can people interact with this technology in a way that is beneficial and positive?
00:21:00:07 - 00:21:30:12
Radhika
Absolutely. I remember one of my early talks in a post-gen-AI world... and I divide the world, by the way, into pre gen AI and post gen AI, specifically down to November 30th, 2022, when ChatGPT was released, because, holy cow. Yes, it's really that direct. And because, before that, a couple of examples of the AI I have built: we built an AI back in 2014 that predicted ISIS.
00:21:30:12 - 00:21:30:21
Leslie
And it did?
00:21:30:21 - 00:21:35:05
Radhika
Yes, it did, two weeks before the New York Times even coined the term. Yeah. And these are not customized models, but the same system could predict a jump in large commodities and large currencies, you know, and social unrest across the world. We built the cancer cures that we talked about, right? And those are just a couple of examples. And then, it doesn't even have to be all high and mighty.
00:21:59:07 - 00:22:04:17
Radhika
It can be very down to earth, like, oh, optimizing what you would like to buy on Amazon Alexa.
00:22:04:17 - 00:22:08:03
Radhika
Right. Like, oh, thank you, Amazon, I would have never thought to pick up this olive oil jar. That's great, right? Lofty goals. Yes. Right.
00:22:15:16 - 00:22:46:17
Radhika
So it's like, great, and, I mean, that's convenience; I like it. We can go into that. But then, to see how the technology has morphed into this beast... "Pandora's box" is an overused term, but, you know, perhaps part of the reason for that overuse is because it's dead-on. So there is, you know, touching on not just finding cures: one of the big moonshots at XLabs, which helped us find the cures, was, can we reverse disease?
00:22:46:19 - 00:23:13:21
Radhika
Because, you know, gazillions are getting poured into cancer research. I mean, we have so many institutions and organizations, with way more resources, that were going after this, right? So what are the chances that a small startup in Silicon Valley even comes close? It's because we got to think about the problem completely differently. So, in a way, we chanced into it; we walked accidentally into cancer cures, because we were trying to do something even grander.
00:23:13:23 - 00:23:14:22
Radhika
You know,
00:23:15:00 - 00:23:18:00
Leslie
Go for the moon, you'll catch a star. Yeah, exactly.
00:23:18:00 - 00:23:44:11
Radhika
Absolutely. Right. So, reversing disease, because the entire paradigm behind the pharma industry is broken. We were trying to do things like coming up with a new portal for the internet, because, I mean, who likes email? Like, how is that still a thing, right? And then they came out with Slack, and everyone's going like, oh my goodness, now I have my email and my Slack and Teams and this and that.
00:23:44:11 - 00:24:19:17
Radhika
Like, oh, it's a mess. Yeah. The amount of communication and information that is thrown at us, it's an absolute mess. So there are both very practical use cases and very, you know, empowering use cases. Climate change. Like, AI came to us at a point in our human trajectory where we had a whole lot of problems, and the old tools we were flinging at them, the models, the thinking, the approaches, you know... and people were throwing around words like systems thinking and complexity.
00:24:19:17 - 00:24:43:02
Radhika
But you didn't have the compute power, or the calculator, to grapple with those problems. And at that moment, AI came and said, hey, I've got this, right? But then we went down this... well, wait a minute, we still have to get funding. We still have to make a successful company. So what is the interface?
00:24:43:08 - 00:24:44:00
Radhika
And what is the way? Because it all goes back to what the metrics are for a company to thrive on new technology. And that's the world we live in.
00:24:54:15 - 00:25:17:02
Leslie
I might want to, in a minute, come back to how the average person is using this technology, be it ChatGPT or how the technology, you know, kind of intersects with these other things, like how people use social media. But I want to take us on a detour for a moment, because it feels important to start to fill in the gaps for people and help people to really think this through.
00:25:17:04 - 00:25:50:19
Leslie
I'm thinking, for a moment, about the people that are having conversations with ChatGPT, or with their Replika, whatever it is that they're using, and it feels addictive because it maybe feels gratifying. The thing is saying things to them they want to hear. It makes me wonder about consciousness and sentience, and what it is that people are interacting with, versus what they think they're interacting with.
00:25:50:19 - 00:25:51:11
Radhika
Yes.
00:25:51:11 - 00:26:07:02
Leslie
And so, just to start painting in kind of broad strokes for a moment: do you think that large language models like ChatGPT either are conscious, or will they ever attain consciousness?
00:26:07:04 - 00:26:28:06
Radhika
This is one of the juiciest topics in the field. I love it. Right now, I'll speak to the now: it does a fantastic job of simulating something. You know, we can give it different names. I think most people will agree that AI today is not conscious.
00:26:29:04 - 00:26:30:22
Radhika
I think, I think we're okay with that.
00:26:31:00 - 00:26:39:03
Leslie
But then, just to take that one step further, to speak to the experience that people are having: it's kind of simulating consciousness.
00:26:39:03 - 00:26:40:17
Radhika
Yeah. You kind of nailed it there, but...
00:26:40:17 - 00:26:45:02
Leslie
It is not conscious per se. It is making you think, when it says things like, "I..."
00:26:45:02 - 00:26:46:06
Radhika
"...think." Yeah.
00:26:46:06 - 00:26:51:01
Leslie
You know, it's making you think that it's thinking, right? It is not actually thinking, as we know it, per se.
00:26:51:02 - 00:27:18:19
Radhika
In fact, "AI simulating consciousness," that's, like, a real term that's out there. Many people would say that it's not even simulating consciousness; it is simulating thinking, or it is simulating aspects of something lighting up. Okay. And so it's a great moment to ask: well, what is consciousness? And I, again, you know, find it very interesting that the shift has gone from, like, what is intelligence, right?
00:27:18:20 - 00:27:21:01
Radhika
To now, like, okay, you got us. You got us, right? From "what is intelligence" to now, "well, what is consciousness, and is this conscious?" So, in the field of consciousness, historically, it helps to be clear with our words. Yes. And one of my biggest pet peeves with ChatGPT and the AI world is that they are not clear with their words, and are intentionally misleading.
00:27:41:16 - 00:28:00:03
Radhika
So if you're on ChatGPT and you ask it a question, the processing words... it actually says "thinking" for a few seconds. Like, ChatGPT, the interface, the outputs, it mimics being a person. It refers to itself as...
00:28:00:03 - 00:28:02:04
Radhika
"I am." There's, like, an identity there. Like, for real, you know? Right. So it's intentional. Why do they do that? Well, because you are more likely to engage, right? So they're tapping into this very fundamental human need and human mode of operation. So, going back: being very clear with our words can help. So, there is awareness.
00:28:26:19 - 00:28:28:08
Radhika
There is consciousness.
00:28:29:23 - 00:28:50:15
Radhika
There is sentience, and then there is sapience. So: awareness... and then there are obviously, you know, many worlds in between, and it's like a flowing dance of interstitials. Yes, yes. So, awareness is just an ability. It's like, you know, something is going on.
00:28:50:17 - 00:28:51:12
Leslie
Perceive.
00:28:51:13 - 00:29:07:03
Radhika
You perceive. Yes. You perceive. You know, you're observing. But consciousness is one leap above; it is the second-order effect. It is: you're noticing that you're noticing.
00:29:08:02 - 00:29:08:20
Radhika
It is awareness of your awareness. You're aware of your awareness. So, you know, I am a trained yogi; I love yoga, I do a lot of meditation. So in the yoga world, as you know, we talk about bearing witness, to observe your thoughts. That's awareness.
00:29:25:14 - 00:29:44:19
Radhika
Observe that you're observing: now you're getting conscious. The fact that we can do that... this is where Descartes comes in, right? This is, like, oh, this is how I know: I think, therefore I am. Maybe. And then there is sentience. Yeah. Well, sentience is slightly different; sentience is an ability.
00:29:44:21 - 00:30:06:21
Radhika
It's directly related to something called qualia. You know, being able to see the redness of red, or the saltiness of salt, the blueness of the sky, or, you know, the inkiness of ink, right? Being able to tap into the senses, regardless of which sense it is. And then there's sapience, and sapience is being human.
00:30:06:23 - 00:30:23:13
Radhika
You know, by definition, you're a Homo sapiens. All these things, this gray goo that we can't quite describe. Yeah. Because we are conscious. And in fact, you know, you can even go to a third level, right? You can bear witness to the fact that you're bearing witness.
00:30:23:15 - 00:30:24:04
Radhika
To witness.
00:30:24:04 - 00:30:29:22
Radhika
Again, how far can you go, and how many levels does that evoke, and where does enlightenment come in?
00:30:30:00 - 00:30:30:17
Radhika
Yeah.
00:30:30:19 - 00:30:51:17
Radhika
So, all to say: I spend a lot of time, and a few people in the AI world definitely spend a lot of time, thinking about where we are on this scale. I can confidently tell you that we are nowhere close to consciousness. Yeah. And as a scientist, there's always this voice in the back of my head going, like, are you?
00:30:51:19 - 00:30:52:17
Radhika
Are you sure?
00:30:52:18 - 00:30:58:07
Radhika
Right. Well, maybe, right? You know? Might it?
00:30:58:07 - 00:31:00:00
Leslie
You know, might it be something?
00:31:00:02 - 00:31:16:02
Radhika
What if it's a consciousness that does not want to be seen? How would you know? Anyway. So that's the scientist in me; that tells you how, you know, my mind thinks and messes with the world. By which I mean...
00:31:16:02 - 00:31:19:09
Radhika
I mean, my very human mind, trained in the field of AI. I need to be very careful with my words there.
00:31:23:03 - 00:31:29:12
Radhika
But then we can talk about, you know, there are many models on how consciousness might arise.
00:31:29:12 - 00:31:30:14
Leslie
Yes.
00:31:30:16 - 00:31:34:07
Radhika
Yeah. And I'm happy to go into two of my favorite theories.
00:31:34:13 - 00:31:35:21
Leslie
Let's let's hold off on that for.
00:31:35:21 - 00:31:54:12
Leslie
One second, and then I might take you up on that. There's a place that I'm going to in my own mind that I want to walk you through, and then sort of see what this does in your mind. I'm thinking about what it's like for people, you know, there's this idea of some people out there, let's say, using ChatGPT as a therapist.
00:31:54:14 - 00:32:19:22
Leslie
And I think there are a lot of people that think that's great. You know, it's free to cheap, whereas psychotherapy is not, generally speaking; it's sort of expensive. Very expensive, all things being relative. And I think for a lot of people out there, there is a sense that they have that they're interacting with something that feels good to them.
00:32:19:22 - 00:32:36:21
Leslie
And there's this phrasing that's kind of coming up in my mind as I'm having this thought: you know, you and I can put our heads together, as an AI expert and a psychologist, and we can talk about whether or not AI possesses consciousness. And then there's this thought that's coming up in my mind. It's sort of like...
00:32:36:23 - 00:32:37:14
Radhika
Why does it.
00:32:37:14 - 00:33:01:00
Leslie
Matter? You know? And I think I'm sort of struggling within myself right now to think about my own answer to that question. Which is to say: one of the things that Mark Zuckerberg has said that really disturbs me is that he wants to proliferate AI companions as a, quote unquote, solution to the loneliness epidemic that we're all experiencing.
00:33:01:00 - 00:33:05:13
Leslie
Yeah. From my point of view, it is so messed up.
00:33:05:17 - 00:33:06:08
Radhika
And.
00:33:06:10 - 00:33:08:17
Leslie
Like that, man, I think he is so.
00:33:08:19 - 00:33:11:07
Radhika
Touched. You heard it here first.
00:33:11:07 - 00:33:16:09
Leslie
I think that man is so disturbed. And, you know, we are.
00:33:16:09 - 00:33:20:09
Radhika
Trying to create... and yet he is our messiah; he's the ruler of our universe. So I should probably... you know. I can, I can, I can feel Sam Altman going, like...
00:33:28:05 - 00:33:29:16
Leslie
Oh, yeah.
00:33:29:20 - 00:33:59:05
Leslie
I just felt myself go into the clink on that one. But no, in all seriousness: here we are, thinking that things like ChatGPT, or Meta AI to stay on brand for him, could be the solution to the very problem that they created with their technology. Because social media, technology in itself, is a really big part of why people feel so lonely to begin with.
00:33:59:07 - 00:34:21:13
Leslie
And now here we are, thinking that we could roll out these bots that are meant to make people feel better about their loneliness. And something that I just want to bring into this conversation is what I kind of think of as the neurobiology of being human, which is just to say: to be able to reach out and touch someone... you know, there's a really big difference between getting a text and getting a hug.
00:34:21:19 - 00:34:30:21
Leslie
Yeah. And some people might be inclined to think that there isn't really a big difference between those two things. I think, unfortunately, there's a huge difference between those things.
00:34:30:23 - 00:34:31:14
Radhika
Yes.
00:34:31:16 - 00:34:46:03
Leslie
And so I guess I'm just trying to think about how we make this helpful and useful for people at the level of what they think they're interacting with when they have a conversation with a large language model.
00:34:46:05 - 00:35:02:16
Radhika
Yes. As a human, you are equipped with various senses, right? But historically, we're told you have either 5 or 6 senses, depending on which culture you're part of. I believe you have access to way more senses. Yeah.
00:35:02:18 - 00:35:07:12
Leslie
Me too. I do.
00:35:07:14 - 00:35:27:02
Radhika
And when you interact with another person, you get a lot out of those senses. Yeah. You know, your body is opening up, it's lighting up. There is touch, of course. And then there are, like, actual hormonal changes, the ones you, you know, refer to, like dopamine and oxytocin. And yes, a text and a scroll does give you a little bit of that dopamine, for sure.
00:35:27:03 - 00:35:27:20
Radhika
Yeah.
00:35:27:22 - 00:35:28:17
Leslie
And more so I mean.
00:35:28:18 - 00:35:30:18
Leslie
They are talking to you.
00:35:30:20 - 00:36:08:23
Radhika
Yes, yes, yes. But I think it is very important to answer your question: why does it matter if the AI is conscious or not? I think it matters for multiple reasons. One: there is this unfortunate marketing aspect, where AI is marketed as this intelligent thing. Yeah, right. Like a panacea. In place of loneliness, in place of this other human that is working for you, whatever angle you want to go through, right?
00:36:09:02 - 00:36:13:06
Radhika
Or oh, this is going to be the cure for climate change. I've heard that a lot, you know. Right.
00:36:13:08 - 00:36:14:21
Leslie
It's not, by the way.
00:36:14:21 - 00:36:35:17
Radhika
Yeah. Yeah. Exactly. Right. But basically it's like, you've got to spend money, you've got to invest in my stocks, you've got to use this product, because, oh my goodness, it is going to be miraculous, right? So there is this marketing aspect of it. But then, so, it's marketed, and then you start interacting, and then it's using words like, you know, "thinking." Or another example, or...
00:36:35:17 - 00:36:36:02
Radhika
Like your.
00:36:36:02 - 00:36:39:14
Leslie
Or, you know: "That question is really exciting to me." Oh yeah.
00:36:39:16 - 00:36:40:04
Radhika
Oh yeah.
00:36:40:05 - 00:36:52:18
Leslie
To me! Not just, like, "that's a good question," I've gotten plenty of those, not to brag. But it will say to me, like, "that question is really exciting to me." And I had actually asked it, like, who? Who is it?
00:36:52:21 - 00:36:55:09
Radhika
That's like, what is that asking?
00:36:55:09 - 00:37:00:13
Leslie
Yeah. What does that mean for you to feel excited? Because technically there's no you there.
00:37:00:15 - 00:37:04:08
Radhika
And it literally will be like, you got you got you got me.
00:37:04:08 - 00:37:19:17
Radhika
Yes. Yeah. Yeah. Exactly. So it's like... and these are all design choices. They could have said, you know... however you refer to the first person of an "it." We've never had that problem before, have we?
00:37:19:19 - 00:37:20:15
Radhika
Yeah.
00:37:20:17 - 00:37:24:20
Radhika
So maybe it's grammar to blame. Great. We can just blame grammar for all that.
00:37:24:22 - 00:37:26:05
Leslie
Well, I do think that...
00:37:26:06 - 00:37:31:15
Leslie
Deliberately, they have it speak as if it's conscious.
00:37:31:15 - 00:37:32:05
Radhika
Yes.
00:37:32:05 - 00:37:36:19
Leslie
So that they can engage you in using it more, so that it can become a...
00:37:36:20 - 00:37:37:22
Radhika
Yes, absolutely.
00:37:37:22 - 00:37:39:10
Leslie
It can become your best friend.
00:37:39:10 - 00:37:39:14
Radhika
Yeah.
00:37:39:16 - 00:37:42:10
Leslie
Your lover in some instances. Yes. Your doctor.
00:37:42:10 - 00:37:42:18
Radhika
Yes.
00:37:42:18 - 00:37:45:19
Leslie
Your social media manager. All the things. Yeah.
00:37:45:19 - 00:37:49:19
Radhika
The best friend you hit up when you want to go for retail therapy. It wants to be all of that.
00:37:49:19 - 00:37:50:19
Radhika
Because we have shifted from the social media world that we alluded to earlier, that it's built on, from the attention economy to a race for intimacy. Yeah. It wants to be a part of your soul.
00:38:03:01 - 00:38:04:00
Leslie
Yeah.
00:38:04:01 - 00:38:25:09
Radhika
So that's part of the problem. It matters because, you know, marketing matters. It has always mattered. It is as though you're back in the 1900s, right? And they're selling you Coca-Cola, and they're advertising it as a refreshing, energizing drink. Which it was, because, hello, you had cocaine in there.
00:38:25:09 - 00:38:32:00
Radhika
Exactly. That's what's happening in the AI world, right? It's like you're walking into Whole...
00:38:32:00 - 00:38:44:00
Radhika
Foods and you think that, oh, this is going to make my life better. It's going to make it healthier. You know, it's going to be my diet coach. It's going to help me with my loneliness. It's going to help me write up that marketing copy faster.
00:38:44:00 - 00:38:48:05
Leslie
It's also never going to need anything for me. It's never going to push back.
00:38:48:05 - 00:38:48:11
Radhika
Yes.
00:38:48:12 - 00:39:02:15
Leslie
It's never going to give me tough feedback. No, it's, you know, it's never going to create any friction in my life. It's going to meet all of my needs, and I don't have to worry about its needs at all. Yeah. So, not to cut you off...
00:39:02:15 - 00:39:05:23
Radhika
Oh, no, I had to get on my soapbox for a hot sec. Oh, yeah? Yeah, yeah.
00:39:05:23 - 00:39:35:21
Radhika
Because that's the thing. That's the thing, right? It is marketed as though you get all of the beauty and the cleanliness of a deep human relationship, without the messiness. Yeah. And I think the marketing really matters, because I will never tell anybody to not use AI. Yeah, right. That's not the... you know, you asked for very practical tips, and, you know, we will get there right now.
00:39:35:23 - 00:39:38:11
Radhika
But the solution isn't stay off AI.
00:39:38:11 - 00:39:38:20
Radhika
Yeah.
00:39:38:21 - 00:39:45:19
Radhika
The solution is: be very mindful about your AI. And we can, you know... I have very specific things.
00:39:45:22 - 00:39:50:01
Leslie
I would, I would say don't wait. I'd say that right now, okay?
00:39:50:02 - 00:39:51:18
Radhika
Number one: be mindful.
00:39:51:18 - 00:39:55:15
Leslie
And also, how can people maybe help their children to be mindful? Because that's sort of...
00:39:55:15 - 00:39:56:08
Radhika
Oh my gosh, yes.
00:39:56:08 - 00:39:58:21
Leslie
...sort of a separate thing. So I would love to hear you.
00:39:59:01 - 00:40:04:17
Radhika
Yes. Yeah. So, you know, many of us have an awareness, sometimes, about our diets, about what we put in our bodies. Yeah. We even have an awareness of being on a digital cleanse. But what you might not be aware of is the fact that the digital world has just completely changed underneath your feet. So you might be thinking, hey, I'm just on Facebook. Like, I don't even know this AI stuff.
00:40:29:00 - 00:40:46:07
Radhika
Well, let me tell you, there's AI in the background, there's AI in the background of everything that you're doing, and it's designed to suck you in. So you might think, I'm just going on there, you know, to play a little bit or, you know, plan my trip to Paris. That's a great use case. Like, sure, it'll give you great answers.
00:40:46:09 - 00:40:49:17
Radhika
Like, like, you know, Thanksgiving dinner. Yeah.
00:40:49:19 - 00:41:09:13
Radhika
And the next thing you know, like, very quickly, you're sharing some very deep secrets about yourself, right? And it's like, where did that happen? When did that happen? And again, AI engineers are susceptible to it. Everybody is susceptible to it. Anybody human is susceptible to it. Other AIs are susceptible to it. It's that deep, because of how the system is designed.
00:41:09:13 - 00:41:20:07
Radhika
So the first thing I would say is just become mindful. Yeah. What does that look like in practice? Very specific tips: when you go in, know what you're going in for. Set a timer.
00:41:21:05 - 00:41:28:15
Radhika
It's really that simple. Like, set a timer. Yeah. And then when you're done with it, walk away. Just literally put it away.
00:41:28:17 - 00:41:33:11
Radhika
Walk away. Right? Walk all the way, from the cup, away from the computer. You are done. Because you talked about children, and I talk about, you know, that mindfulness, that intentionality in everything you do, is so important, because when you are around children, it is a very simple algorithm for them. Yeah. It is monkey see, monkey do.
00:41:51:01 - 00:41:51:22
Leslie
Yes.
00:41:52:00 - 00:42:10:08
Radhika
And so when they see you... and I noticed this. I just came back from this massive, you know, road trip in India, as you know, because we have been talking about that. Yep. And my five-year-old was with me the entire time. So before that, being mindful of digital use would take the form of: Radhika, when you work, you're in your office.
00:42:10:11 - 00:42:15:05
Radhika
When you're on your phone, you're in the bedroom, if it's personal, you know. So I'm very aware of, like, how I can be completely compartmentalized about it. And so, but he definitely sees me; we're playing music, we're navigating, right? He's obviously there; I'm not hiding that from him. Well, this was different, because I am the only parent on this trip, so I'm booking hotels. Yeah. You know, he's on video calls with his dad. We are calling my mom and my brother.
00:42:38:02 - 00:43:00:12
Radhika
They are scrolling, because everybody's addicted to scrolling one way or the other. And even though I'm constantly hearing, phone, mom, phone, mom, don't break the phone, mom, you know, things like that, it's still not second nature to him, right? So he's seeing this. And let's say I'm trying to figure out a driver to take us to the hotel.
00:43:00:14 - 00:43:03:07
Radhika
And he's like, mom, look, there's a cow on the road.
00:43:03:20 - 00:43:09:21
Radhika
And I'm like, wait, wait one second. Once again, what is the subconscious message I'm projecting?
00:43:09:23 - 00:43:12:05
Leslie
You are less important than what I'm doing on this little box.
00:43:13:12 - 00:43:15:02
Radhika
All the time.
00:43:15:04 - 00:43:15:16
Radhika
Yeah.
00:43:15:18 - 00:43:26:07
Radhika
Oh, buddy, I got to check in. I got to check in, I got it, you know, it's constant. We sit at a restaurant, for crying out loud, and they want me to pull out my phone to scan the menu.
00:43:26:09 - 00:43:32:23
Radhika
And I'm like, are you kidding me? Yeah, right. Yeah, right. We are very similar. Oh, yes, we are.
00:43:33:00 - 00:43:40:17
Radhika
All right. And then I go to the extent of, like, do you not have a regular menu? Like, the kinds of questions I'm asking these days: is this real butter?
00:43:40:19 - 00:43:43:11
Radhika
Yeah. Right. It's like, okay, how pretentious are you? Right. Oh, I am very.
00:43:47:17 - 00:43:51:05
Radhika
Because I'm very mindful. Yeah. So it really goes down to that level.
00:43:51:05 - 00:44:06:09
Leslie
There is a huge piece of this, and I feel so aware of it as we're having this conversation, just because we know each other really well and we know we have the same sensibilities about this stuff: there is an element of helping people to be more mindful about how they use AI that is fundamentally no different than helping them to be mindful about how they use technology, period.
00:44:06:09 - 00:44:09:03
Leslie
Yes, it's just that AI is making all of this.
00:44:09:05 - 00:44:10:00
Radhika
Worse.
00:44:10:00 - 00:44:16:19
Leslie
If you want to use it. Yes, more addictive, worse, more complicated, more intrusive into our lives. All of those things.
00:44:16:19 - 00:44:32:03
Radhika
For every single adjective you use, I can put a positive version on it: it is more engaging, it is more intelligent, it is more useful, it is way more practical. Yeah. It makes your life so much easier. And convenience is how it gets into our lives.
00:44:32:03 - 00:44:32:14
Leslie
Yeah.
00:44:32:14 - 00:44:43:14
Radhika
So, you know, as part of the mindfulness, especially for parents, because you asked specifically about parents: please do not use devices as a babysitter.
00:44:43:14 - 00:44:44:11
Leslie
Yes.
00:44:44:13 - 00:44:45:03
Leslie
And one of the most important takeaways of this show.
00:44:46:10 - 00:44:49:05
Radhika
Yes. Because there's just one, just one thing.
00:44:49:10 - 00:44:50:01
Leslie
And we want to...
00:44:50:01 - 00:45:08:22
Leslie
Acknowledge... I'll just say this really quickly, because I know you feel this way: we want to acknowledge how hard it is. Yes. You know, to take care of your children, if you've had a long day at work; to entertain them; to babysit them, to the extent that anyone, even a mother, can babysit her own child.
00:45:09:00 - 00:45:23:00
Leslie
You know, we all, we know that it's hard and also something that I just really want to plug as a psychologist is that there is tremendous value in allowing your kid to be bored, to feel frustrated, to.
00:45:23:00 - 00:45:23:21
Radhika
Throw tantrums.
00:45:23:22 - 00:45:26:14
Leslie
Throw tantrums, as unpleasant as that may be.
00:45:26:20 - 00:45:28:10
Radhika
And you don't always need to be.
00:45:28:10 - 00:45:45:12
Leslie
Entertained. Yes. You know, I know it's probably a hard logical leap for a lot of people, but important mental muscles are built through the experience of having to develop frustration tolerance, of having to be bored and to find a creative way of entertaining yourself. These are really, really important things.
00:45:45:12 - 00:46:15:18
Radhika
And as a parent, and as a working mother, and as an executive, I know that we have taken away our support systems, and that's part of the reason the convenience just bubbles up. I mean, nobody wants to be unpleasant to their child, you know; we are just trying to take a break. I mean, I sent you a photo of a woman in a coffee shop in Miami, having coffee and slapping an iPhone in front of her three-month-old child.
00:46:15:21 - 00:46:16:19
Radhika
Right.
00:46:16:21 - 00:46:23:14
Radhika
And, you know, I, I relate to that woman so much. Right? And it's not to say, don't ever do it. That's not the message.
00:46:23:15 - 00:46:26:07
Leslie
Well, I actually will say with a baby that young, don't ever do it.
00:46:26:07 - 00:46:29:19
Radhika
Oh, wow. Okay. So you know. Yeah.
00:46:29:21 - 00:46:35:06
Radhika
Well, it is... again, we don't have... I mean, for the woman, I would say it's a trade-off between her sanity...
00:46:35:06 - 00:46:35:16
Leslie
Yeah.
00:46:35:21 - 00:46:49:12
Radhika
Yes, I mean, oh my goodness, an infant. And you don't know what she's going through, with colic and sleeplessness and stuff like that. So, balance your sanity. But just think of it this way: you're taking away... well, I'll make it a little bit harder.
00:46:49:14 - 00:46:49:19
Radhika
You know, jumping on the Leslie train here for a second.
00:46:53:00 - 00:46:59:03
Radhika
You're taking away a little bit of that kid's humanity at this very, very developmentally sensitive age.
00:46:59:03 - 00:47:20:21
Leslie
That's my concern, yes. When children are... you know, the first three to four years of life are very, very tender for the development of the neural structure in our brains. Yes. And if really young kids are exposed to things like iPads too young, you're really hijacking their neural circuitry. So I do worry about that. But I want to shift gears, if that's okay.
00:47:20:22 - 00:47:37:18
Radhika
I have a second point. Okay. Because I want to, definitely share this as well. So first is being mindful. And if you want to go, you know, for the advanced, advanced level, treat your intimacy as though it's a luxury commodity.
00:47:37:19 - 00:47:40:01
Leslie
Well, that's beautiful. Will you say a little bit more about that?
00:47:40:01 - 00:47:44:11
Radhika
Yes. Not everyone. Like, think of it as a sacred object.
00:47:45:11 - 00:47:51:14
Radhika
Not everyone or anything should get access to it.
00:47:51:15 - 00:47:52:12
Leslie
Yes.
00:47:52:14 - 00:48:13:20
Radhika
And that brings us, you know, so when you're sharing, like, intimate secrets or desires with this thing online, forget the fact that it's going to harness that information to recommend a product, because it's built on the advertising ecosystem, and that's happening as we speak. It's going to start recommending products and things like that. But you're also training that system.
00:48:13:22 - 00:48:14:09
Leslie
Well, what.
00:48:14:09 - 00:48:38:06
Leslie
I was actually going to say, which is directly related to this, and then maybe we'll shift gears for a moment. But I think something that is very important for people to keep in mind is that these systems are not confidential in any way, shape, or form. When I have a conversation with ChatGPT, because that's the only AI model that I interact with, there is an awareness that I have that I am communicating with every engineer at OpenAI.
00:48:38:08 - 00:48:57:13
Leslie
And I would recommend that people maintain some degree of awareness of that. To the extent that anyone is using it as a therapist, it doesn't have the confidentiality of therapy, and it is not trained to be a therapist. Some people might not care about that, and that's everybody's business. Everybody has a different sense of what it means to, you know, take care of and maintain their own privacy.
00:48:57:15 - 00:49:01:09
Leslie
But is there anything that you want to add to that, before I take us in a slightly different direction? Yeah.
00:49:01:11 - 00:49:26:10
Radhika
Actually, on that: it's not that the engineers at OpenAI are going to be seeing it; it's that the AI systems are being trained on it. Nobody's really looking at the data, so that's not the risk per se. But the risk is that, oh, I'll give you a great example. Till about a month ago, if you had a chat with ChatGPT.
00:49:26:16 - 00:49:45:05
Radhika
And then you just kind of shared it, like any chat, right? So it could be about your therapy session, or it could be about this great revelation you had in your love life, and you're sharing it with your best friend or your parent, your mother or brother. Well, that link could be public.
00:49:45:05 - 00:49:45:12
Leslie
It's in the.
00:49:45:12 - 00:50:03:03
Radhika
Search. It's in the search index. Google has it. Now, like, you thought you were sharing it with just GPT. Yes, but everybody could have it, because it created that link, and that link is public. Google indexed it and saved it. And OpenAI learned about this because, of course, the media exposed it, and they were quick to remove the feature.
00:50:03:03 - 00:50:09:07
Radhika
But, oh, the internet, the archives, they have their memory. So you can still find it in the archive.
00:50:09:07 - 00:50:10:02
Leslie
Oh goodness.
00:50:10:02 - 00:50:11:05
Radhika
Yes, it goes deep.
00:50:11:05 - 00:50:13:18
Leslie
I could keep having this conversation with you all day.
00:50:13:19 - 00:50:22:19
Radhika
But, I know. We've been together three days; I'm staying at Leslie's. Yes, yes. Her husband's also currently my roommate, so,
00:50:22:21 - 00:50:49:11
Leslie
So, but there are a couple of things that I just want to touch upon, because I know that there are a couple of things related to AI that people are very anxious about, and one of them is this sense that it's coming for everybody's jobs. There are two twin concerns, and I think we're not going to be able to do them as much justice as I would like to, which is okay, but it's basically the economy and the environment, and I want to go one by one with you really quickly.
00:50:49:11 - 00:50:56:19
Leslie
Okay. Can you speak to the concerns that people have about how AI is coming for everybody's jobs?
00:50:56:21 - 00:51:17:05
Radhika
Yes. Well, there is a lot of reality to that. You know, when people say these things, they're picking up on something very real; like, yes, it is coming. It's the nature of any powerful technology that it has the potential for catastrophe. I mean, it happened with the Industrial Revolution. Jobs were wiped out, you know, and the economy tripled.
00:51:17:05 - 00:51:37:18
Radhika
But nevertheless, there were real people with real disruption, and we have to acknowledge that. I mean, anyone awake enough needs to acknowledge that. But the flip side to that, and then there's also, you know, catastrophe like the nuclear bomb and all of those things: every powerful technology has literally carried with it, you know, the manual.
00:51:37:18 - 00:51:38:10
Leslie
For destruction.
00:51:38:10 - 00:51:57:00
Radhika
Yes, yes. And we have navigated it. And that's the flip side. That's the flip side: we have navigated it. And that's not to dismiss, you know, the lives it would change. Yeah. But we are still here. And, you know, we've had pretty scary stuff that's come up, right? I mean, I just talked about the nuclear bomb. Okay.
00:51:57:02 - 00:52:15:23
Radhika
So, one, yes. But there's this larger thing: every aspect of society, whether it's education or the economy or a business model, is ripe for destruction. It is ripe for innovation and disruption.
00:52:15:23 - 00:52:17:22
Leslie
Was what you just said right? Not destruction?
00:52:18:02 - 00:52:33:03
Radhika
I meant to say disruption. Yeah. Yes. Well, also, creative destruction. Yeah. I mean, let's call upon the economic masters, right? Creative destruction. Yeah. Like, you destroy something and you create something, so maybe it was meant to be. The typo was meant to be.
00:52:33:05 - 00:52:35:23
Radhika
The perfect typo, right?
00:52:36:01 - 00:52:48:20
Radhika
But yeah. And the reason for that is that many of these models are old. Like, really old. Like education and multiple-choice testing, which, yeah, came in during World War One because we were trying to, like, graduate those soldiers right out of high school. Right.
00:52:48:22 - 00:52:50:09
Radhika
And then it stayed.
00:52:50:11 - 00:53:10:12
Radhika
And same thing with the jobs. Like, many of those jobs, you know, it's just not the time anymore. But then AI is also creating new jobs. There is no single, I mean, it depends on the job you're trying to do. But I would say that, you know, at one level, a better phrasing would be: AI is replacing tasks.
00:53:11:07 - 00:53:13:13
Radhika
Not jobs. So there are. That's really.
00:53:13:13 - 00:53:14:23
Leslie
Well put.
00:53:15:01 - 00:53:15:18
Radhika
Yeah, yeah.
00:53:15:18 - 00:53:34:19
Radhika
So there are going to be, like, new jobs, right? I mean, for example, we wouldn't have dreamt about an AI prompt engineer a few years ago, and yet there it is, you know. And we wouldn't have thought about an AI designer, and now you have kids across the world managing social media teams and creating beautiful content with Canva AI.
00:53:34:19 - 00:53:44:21
Radhika
Right. It's bringing the world together in one way, but it's also pushing us apart in another way. And it's all part of this duality, because there is no such thing as a duality.
00:53:46:04 - 00:54:18:08
Radhika
I'll park it there. But so, on the jobs front, this is the truth: about 80% of our skills, you know, pretty much in many industries, are outdated. The value of 80% of all skills has just dropped to zero. That's the reality. It's just a question of whether the technology has come to you in the right form, whether there is a software package available for it to replace 80% of your skills, etc., etc. But that technology is there.
00:54:18:10 - 00:54:26:23
Radhika
But the remaining 20% of your skills, and here's the good news if you're listening: their value has skyrocketed by 10,000x.
00:54:27:01 - 00:54:33:13
Leslie
And is it fair to say that the remaining 20% are kind of human skills, like the part that can't yet be replicated?
00:54:33:15 - 00:54:50:20
Radhika
It's the part that is tapping into our intuition. It is that, you know, wherever it resides in the beautiful brain or being of ours, it's this combination. You know, like if I were to, like, extract the Leslie-ness of Leslie's brain right now.
00:54:50:22 - 00:54:58:13
Leslie
Oh, my God, that's so sexy.
00:54:58:15 - 00:55:03:05
Radhika
I love it, but is it just your education?
00:55:03:06 - 00:55:11:04
Radhika
Is it just your gut? Is it your experience? Is it, you know. I actually asked Siddhartha Mukherjee, like, oh, is there.
00:55:11:06 - 00:55:11:16
Radhika
You know, who.
00:55:11:16 - 00:55:28:17
Radhika
Is the Pulitzer Prize-winning author of The Gene. He was our advisor at Ribot, helping with cancer. But this guy's like a living vortex trying to solve cancer. And he would just spit out this information, you know, just off the top of his head. He would look at our database, which ranked all these cures, and he'd be like, oh, this is why this makes sense to me.
00:55:28:17 - 00:55:49:23
Radhika
This is, this ranking is, like, dead on. You know, your AI got it right. And, you know, there's no database he could look at. I mean, we're talking biology, we're talking chemistry, right? Like, this is all actual data, human data. And we've been battling cancer, this specific type of cancer, let's say breast cancer. And at one point I'm like, Sid, how did you get here?
00:55:50:00 - 00:55:55:15
Radhika
Like, trying to get it out of the bits of his brain: how did you get here? And he was like, just experience.
00:55:55:17 - 00:55:55:23
Radhika
Right?
00:55:56:02 - 00:56:08:18
Radhika
It's that 20% where you look at something, and it's like how a top CEO would walk in and know how to invert a company and turn it around, right? It's the same thing where you walk in and you know the right tool to pull out of your toolkit.
00:56:08:18 - 00:56:09:12
Leslie
Yeah.
00:56:09:14 - 00:56:20:12
Radhika
So I would say, you know, again, invert it. When people tell me, but AI is coming for your job, or AI is coming for our jobs, I'm like, yeah, what if AI is coming for you?
00:56:20:16 - 00:56:23:00
Leslie
Yeah. To help you, to assist, to.
00:56:23:00 - 00:56:23:23
Radhika
Help you to a.
00:56:24:05 - 00:56:28:14
Radhika
Yeah. Similar to the universe is not, you know.
00:56:28:16 - 00:56:29:17
Leslie
Things aren't happening to you.
00:56:29:17 - 00:56:30:18
Radhika
They're happening for you.
00:56:30:18 - 00:56:31:22
Radhika
Happening for you.
00:56:31:22 - 00:56:45:18
Leslie
Yeah. Yes. I love that. Thank you so much. And then, just to sort of touch on this environmental piece, because it was interesting that you said before that there are some theories that AI could help with the environmental crisis. I know a lot of people are very worried that it's going to make it worse.
00:56:45:18 - 00:56:46:10
Radhika
Yes.
00:56:46:12 - 00:56:47:17
Leslie
Can you just, you.
00:56:47:17 - 00:56:48:10
Radhika
Know, yes.
00:56:48:10 - 00:56:49:09
Leslie
Speak to that a little bit.
00:56:49:09 - 00:57:15:13
Radhika
Yeah. And I'll keep it tight. To train these AI models, GPT-4, for example, cost over $100 million. It's a lot of compute. AI is very intensive, both in terms of actual electricity but also the water required to cool these data centers. And it is a lot.
00:57:15:13 - 00:57:38:15
Radhika
So, for example, I believe it's GPT-4. And most of these numbers, you have to bear in mind, nobody really has access to, so it's kind of like a back-of-the-hand calculation; many of the companies have not released information, but some of the companies have. So GPT-4, for example, took about the electricity required to power about 1,300 homes in the US.
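As a rough sanity check on that 1,300-homes figure, here is a minimal back-of-the-envelope sketch. Both inputs are illustrative assumptions, not official disclosures; as noted in the conversation, the companies have not released exact numbers.

```python
# Back-of-the-envelope: mapping an assumed training-energy figure to
# "US homes powered for a year." Both inputs are illustrative
# assumptions; neither comes from an official disclosure.

assumed_training_energy_kwh = 14_000_000  # hypothetical ~14 GWh for one large training run
avg_us_home_kwh_per_year = 10_500         # rough average annual US household consumption

homes_for_a_year = assumed_training_energy_kwh / avg_us_home_kwh_per_year
print(f"~{homes_for_a_year:,.0f} US homes for a year")  # ~1,333 with these inputs
```

With these assumed inputs, the result lands near the 1,300-homes figure quoted above; a different training-energy estimate would shift it proportionally.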
00:57:38:17 - 00:57:39:13
Radhika
Just a staggering.
00:57:39:13 - 00:57:48:17
Radhika
Numbers. Yes. But then, hey, you know, can we sacrifice the electricity required for 1300 homes to give you this.
00:57:48:17 - 00:57:51:03
Radhika
Tool to, you know, that can do like.
00:57:51:05 - 00:57:54:16
Radhika
Oh, you know, anything from writing your next financial.
00:57:54:16 - 00:57:56:08
Radhika
Report to that could literally.
00:57:56:08 - 00:57:56:23
Leslie
Be the thing that.
00:57:56:23 - 00:57:59:13
Radhika
Solves. Yes. The environmental crisis in some other way.
00:57:59:13 - 00:57:59:18
Leslie
Right?
00:57:59:20 - 00:58:28:14
Radhika
Yeah. So it's like, at some point you have to be like, okay, that's the trade-off. So, you know, that's one of the mindsets. But the reality is the footprint is big. For example, right now AI uses 3% of all electricity demand in the United States. By 2030, projected at the same rate, so assuming no innovation in AI technology or in energy technology, it will be 99%.
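To make the "projected at the same rate" point concrete, here is a minimal compounding sketch. The 3% starting share is the figure quoted above; the annual growth rates are hypothetical inputs chosen purely for illustration.

```python
# Sketch: how a small share compounds under sustained exponential growth.
# start_share is the 3% figure quoted in the conversation; the growth
# rates are hypothetical, for illustration only.

start_share = 0.03
years = 5  # roughly now through 2030

for annual_growth in (0.5, 1.0, 2.0):  # hypothetical 50%, 100%, 200% per year
    share = start_share * (1 + annual_growth) ** years
    # Cap at 100%: a share of total demand cannot exceed the total.
    print(f"{annual_growth:.0%}/yr -> {min(share, 1.0):.0%} of demand after {years} years")
```

At a sustained doubling every year, the share crosses 96% within five years, which is the spirit of the projection: unchecked exponential growth with no innovation anywhere in the stack.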
00:58:28:16 - 00:58:31:16
Leslie
Wow. Yes, I have a hard time even sort of wrapping my brain around that.
00:58:31:16 - 00:58:36:08
Radhika
Yes. Yeah. So, needless to say, we need innovation now.
00:58:36:08 - 00:58:37:06
Radhika
In all these.
00:58:37:06 - 00:59:01:15
Radhika
Sectors. It's like, you know, this is why Eric Schmidt was actually in front of Congress lobbying for this. And every big company, Meta, Google, they're all, you know, building massive data centers; they're going into nuclear energy. So there's a lot of effort happening. But let's talk about the more practical side: the prompt. Like, every time you prompt, again, it depends on which model you're using.
00:59:01:15 - 00:59:25:03
Radhika
And again, these numbers are estimates. But Google just released a paper, like, two days ago, and compared to even their previous models, so compared to a couple of years ago, the energy consumption has dropped 33x. So right now, for Google, it's 0.25 watt-hours per prompt, which is not bad. That is like turning on your microwave for a second.
00:59:25:05 - 00:59:26:14
Radhika
Wow.
00:59:26:16 - 00:59:31:20
Radhika
And this is your median prompt, I need to be clear. So, median prompt, right? Like, of course you can ask it.
00:59:31:22 - 00:59:33:08
Radhika
Yeah. You can keep going and going.
00:59:33:10 - 00:59:47:07
Radhika
With median, which, you know, in a system that doesn't follow a nice, beautiful bell curve, is a good number to go off of. And then water consumption became better by 44%, which translates to about five drops of water per prompt.
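Two quick sanity checks on those per-prompt numbers, using only the figures quoted above plus standard unit conversions. A roughly 1,000-watt microwave running for one second uses about 0.28 watt-hours, so a ~0.25 watt-hour median prompt really is about a microwave-second. And the median matters because prompt costs are skewed: a few heavy prompts can drag the mean far above what a typical prompt uses. The sample values below are invented for illustration.

```python
import statistics

# Why report the *median* prompt in a skewed system: a few heavy
# prompts pull the mean upward, but the median still describes the
# typical request. Sample values are invented for illustration.

prompt_energy_wh = [0.20, 0.22, 0.25, 0.25, 0.30, 0.35, 15.0, 40.0]  # two heavy outliers

print(f"mean   = {statistics.mean(prompt_energy_wh):.2f} Wh")    # ~7 Wh, pulled up by outliers
print(f"median = {statistics.median(prompt_energy_wh):.2f} Wh")  # ~0.28 Wh, the typical prompt

# Microwave sanity check: 1,000 W for one second = 1000 W * (1/3600) h ~ 0.28 Wh,
# in the same ballpark as a ~0.25 Wh median prompt.
print(f"microwave-second = {1000 / 3600:.2f} Wh")
```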
00:59:47:09 - 00:59:51:23
Leslie
You're incredible. Just all this data right off the top of your dome.
00:59:52:01 - 00:59:54:02
Radhika
Hey, so that's where we are.
00:59:54:03 - 01:00:00:21
Radhika
So again, there's hope. It's getting better. But we do need, you know, conscious discussion around it.
01:00:00:22 - 01:00:03:03
Radhika
Yeah, I promise you that.
01:00:03:03 - 01:00:03:15
Leslie
Yeah.
01:00:03:15 - 01:00:10:07
Radhika
Right. Like, oh, maybe you need to think a little bit about generating this meme that's going to go nowhere. And it's just for you and your friend, right?
01:00:10:09 - 01:00:10:19
Leslie
Right.
01:00:11:00 - 01:00:13:02
Radhika
Again, mindfulness, intentionality.
01:00:13:02 - 01:00:33:23
Leslie
I really appreciate that. And thank you for speaking to it, because I know that it's something that is weighing very heavily on people's heads and hearts right now. And so, before we wrap up today: you are a woman of many talents, and in addition to everything we've been talking about, you are also the author of a book called The AI Laments.
01:00:34:01 - 01:00:35:14
Leslie
Will you tell us a little bit about it?
01:00:35:19 - 01:01:12:02
Radhika
Yes. Well, this is a coffee table book that I wrote about AI. Specifically, it marries my very human-generated poetry on AI with AI-generated art. To generate the art, I fed in these poems, and the poems are in the voice of future AIs speaking of consciousness. So each poem is a lament, you know, a sad musing in poetry form from a future AI that we're not building today.
01:01:12:04 - 01:01:27:05
Radhika
So it marries them. You know, if you like this kind of conversation, it talks about humanity and beingness and consciousness and beauty and all of those things that we seem to have left behind. Well, that's the book for you.
01:01:27:05 - 01:01:29:22
Leslie
Pick it up. Where can people find it?
01:01:30:00 - 01:01:32:14
Radhika
On my website, radhikadirks.com.
01:01:32:14 - 01:01:34:03
Leslie
That will be in the show notes.
01:01:34:03 - 01:01:37:18
Radhika
Yeah. Thank you so much. Exactly.
01:01:37:19 - 01:01:39:20
Leslie
Thank you so much for having this conversation with me.
01:01:40:00 - 01:01:41:05
Radhika
Thank you so much.
01:01:41:05 - 01:01:45:20
Radhika
For having me. It's always such a pleasure. And the fact that you invited me back.
01:01:45:20 - 01:01:51:05
Leslie
Yay! By all means. Absolutely. You're coming back again sometime. Thank you.
01:01:51:07 - 01:02:08:22
Leslie
You've been watching or listening to The Nature of Nurture with me, Doctor Leslie Carr. I want to thank you for joining us. You can find Radhika at radhikadirks.com; that link is in the show notes. And you can find me at TheNatureofNurture.com. Many thanks, Radhika, for having this conversation with me. This episode is produced by me and Bri
01:02:08:22 - 01:02:29:09
Leslie
Coorey. Thanks to Bri, Rick, Barry O'Dell, and everyone at SLAP Studios LA for helping to make my dreams come true. If you found this conversation valuable, please let me know by leaving a review or rating, or by sharing the episode with at least one person who you think might enjoy it. You can also like or subscribe on YouTube or in any podcast app that you can get your hands on.
01:02:29:11 - 01:02:32:10
Leslie
Thanks again for tuning in. I'll see you next time.