Ilya Sutskever (Co-founder and Chief Scientist) 00:00.340
end of those systems, then I can see it go well, at least for quite some time. And then there is the question of what happens in the long run. How do you achieve a long-run equilibrium? And I think that there is an answer there as well.
Ilya Sutskever (Co-founder and Chief Scientist) 00:20.820
And I don't like this answer, but it needs to be considered. In the long run, you might say, okay,
Ilya Sutskever (Co-founder and Chief Scientist) 00:29.140
so if you have a world where powerful AIs exist. In the short term, you could say, okay, you have universal high income, and we are all doing well. But we know that, what do the Buddhists say? Change is the only constant. And so things change. And there
Ilya Sutskever (Co-founder and Chief Scientist) 00:47.060
is some kind of government or political structure, and it changes, because these things have a shelf life. You know, some new government structure comes up and it functions, and then after some time, it stops functioning. That's something you see happening all the time.
Ilya Sutskever (Co-founder and Chief Scientist) 01:03.380
And so I think that for the long-run equilibrium, one approach, you could say, is: "Okay, so maybe every person will have an AI that will do their bidding." And that's good, and if that could be maintained indefinitely, that's true. But the downside with that is, okay, so then
Ilya Sutskever (Co-founder and Chief Scientist) 01:22.500
the AI goes and earns money for the person, and advocates for their needs in the political sphere, and maybe then writes a little report saying, okay, here's what I've done, here's the situation, and the person says, great, keep it up. But the
Ilya Sutskever (Co-founder and Chief Scientist) 01:38.740
person is no longer a participant. And then you could say that's a precarious place to be in. So I'm going to preface this by saying I don't like this solution, but it is a solution. And the solution is if people become part AI, with some kind of Neuralink++. Because what
Ilya Sutskever (Co-founder and Chief Scientist) 02:00.140
will happen as a result is that now the AI understands something and we understand it too, because now the understanding is transmitted wholesale. So now if the AI is in some situation, it's as if you are involved in that situation yourself, fully. And I think this is the answer to the equilibrium.
Dwarkesh Patel (Host) 02:09.060
I wonder if the fact that emotions, which were developed millions or in many cases billions of years ago in a
Dwarkesh Patel (Host) 02:30.100
totally different environment, are still guiding our actions so strongly, is an example of alignment success. To maybe spell out what I mean, the brain stem has these...
Dwarkesh Patel (Host) 02:46.260
I don't know if it's more accurate to call it a value function or a reward function, but the brain stem has a directive saying: mate with somebody who's more successful. The cortex is the part that understands what success means in the modern context. But the brain stem
Dwarkesh Patel (Host) 02:59.020
is able to align the cortex and say: however you recognize success to be, and I'm not smart enough to understand what that is, you're still going to pursue this directive.
Ilya Sutskever (Co-founder and Chief Scientist) 03:14.940
I think there's a more general point. I think it's actually really mysterious how the brain encodes, sorry, how evolution encodes high-level desires. It's pretty easy to understand how evolution would endow us with the desire for food that smells good.
Ilya Sutskever (Co-founder and Chief Scientist) 03:30.140
Because smell is a chemical, so just pursue that chemical. It's very easy to imagine evolution doing such a thing. But evolution has also endowed us with all these social desires. We really care about being seen positively by society, we care about being
Ilya Sutskever (Co-founder and Chief Scientist) 03:49.580
in good standing, all these social intuitions that we have. I feel strongly that they're baked in, and I don't know how evolution did it, because it's a high-level concept that's represented in the brain. What people think... let's say you care about
Ilya Sutskever (Co-founder and Chief Scientist) 04:11.020
some social thing. It's not a low-level signal like smell. It's not something for which there's a sensor. The brain needs to do a lot of processing to piece together lots of bits of information to understand what's going on socially, and somehow evolution said: that's what you should care about. How did it do it?
Dwarkesh Patel (Host) 04:28.860
And it did it quickly too, because I think all these sophisticated social things that we care about evolved pretty recently.
Ilya Sutskever (Co-founder and Chief Scientist) 04:44.520
Yeah. So evolution had an easy time hardcoding this high-level desire. And I maintain, or at least I'll say, I'm
Ilya Sutskever (Co-founder and Chief Scientist) 04:48.280
unaware of a good hypothesis for how it's done. I had some ideas I was kicking around, but none of them are satisfying.
Dwarkesh Patel (Host) 05:07.040
Yeah. And what's especially impressive: if it was a desire that you learned in your lifetime, it kind of makes sense, because your brain is
intelligent, it makes sense why you'd be able to learn