Solutions
In the desert of New Mexico, a group of scientists, entrepreneurs and innovators come together with an ambitious goal: to secure the future of humanity by creating a new and optimistic pathway into the future. Their expertise is diverse – from science and technology to economics, social studies and the business world.
Focusing on the rising gap between rapidly evolving 'physical' technologies (like artificial intelligence and automation) on the one hand, and our slowly responding cultural systems (government, education) on the other, the group identifies some of the major side effects: rising populism, financial instability, income inequality, and the breakdown of a collective conception of objective facts – all against the backdrop of climate change. Without a change in direction, worldwide turmoil will continue to grow.
Solutions offers deeply inspiring insights into a new vision for humanity, with concrete ideas that will pave the way for solving some of the world's most challenging problems.
Citation
Main credits
Grønkjær, Pernille Rose (film director)
Grønkjær, Pernille Rose (screenwriter)
Grønkjær, Pernille Rose (film producer)
Sandholt, Per (screenwriter)
Other credits
Cinematography, Ben Bernhard; editing, Per Sandholt; music, Jonas Struck.
Distributor subjects
Climate Change; Economics; Technology

Transcript
00:00:25.960 --> 00:00:28.120
For 99% of human history -
00:00:28.240 --> 00:00:31.200
- people believed the Earth was
the center of the solar system.
00:00:31.320 --> 00:00:34.520
They looked up in the sky
and saw what happened -
00:00:34.640 --> 00:00:38.000
- and it made perfect sense to them.
00:00:38.120 --> 00:00:43.400
And then 500 years ago, along comes
Copernicus and his buddy Galileo.
00:00:43.520 --> 00:00:47.480
They're like, "Nah, the sun is
the center of the solar system."
00:00:47.600 --> 00:00:52.400
And super interesting and very,
very analogous to our time is -
00:00:52.520 --> 00:00:56.880
- that they went to the powers that be
at the time and said, "Hey, guess what?"
00:00:57.000 --> 00:01:00.640
"The Earth isn't the center
of the solar system."
00:01:00.760 --> 00:01:03.360
Galileo didn't invent the telescope
but he popularized it.
00:01:03.480 --> 00:01:07.000
He's like, "Look, you can
see it with your own eyes."
00:01:07.120 --> 00:01:09.720
And the Catholic Church said to him:
00:01:09.840 --> 00:01:12.960
"You can stick that telescope
where the sun doesn't shine."
00:01:13.080 --> 00:01:16.360
And they stuck him in jail
for the rest of his life. Why?
00:01:16.480 --> 00:01:20.560
Because if Earth was diminished,
so were they.
00:01:20.680 --> 00:01:23.160
And the only fact they cared about -
00:01:23.280 --> 00:01:28.160
- was their status,
their privileges and their power.
00:01:41.200 --> 00:01:45.880
The 21st century will either be
our greatest century or our worst.
00:01:46.000 --> 00:01:49.600
That's going to depend on the choices
we make in the next couple of decades.
00:01:54.080 --> 00:01:56.640
It's time to wake up.
00:01:57.760 --> 00:02:01.600
We can no longer avoid taking action -
00:02:01.720 --> 00:02:05.000
- because there are issues
with our climate -
00:02:05.120 --> 00:02:09.240
- conflicts because of globalization,
growing inequality.
00:02:09.360 --> 00:02:13.720
There are issues
with loss of our freedom.
00:02:13.840 --> 00:02:17.360
This is happening because we don't
have a narrative that makes sense.
00:02:37.440 --> 00:02:39.520
- We're rolling.
- You're rolling? Okay.
00:02:39.640 --> 00:02:41.520
- Rolling.
- All right.
00:02:43.680 --> 00:02:46.160
- How are you doing?
- I'm really enjoying this.
00:02:55.720 --> 00:02:58.440
- Where am I looking, at you?
- You're looking at me.
00:03:03.720 --> 00:03:07.120
I'm trying to think of the right way
to respond to that.
00:03:14.920 --> 00:03:20.040
One of the big challenges for
constructing a winning global narrative -
00:03:20.160 --> 00:03:24.840
- is to be able to identify some foe.
00:03:24.960 --> 00:03:31.240
And whether global threats
we face today are concrete enough -
00:03:31.360 --> 00:03:36.200
- to serve as the villain
in this story is an open question.
00:03:36.320 --> 00:03:40.160
So if we want to create a global "we" -
00:03:40.280 --> 00:03:43.440
- we should invent
an extraterrestrial alien.
00:03:43.560 --> 00:03:49.000
I always thought an alien invasion would
do wonders for helping us come together.
00:03:49.120 --> 00:03:52.320
An alien invasion would solve many of my
problems and disseminate Big History.
00:03:52.440 --> 00:03:56.200
An invasion by Martians would be
really helpful right now.
00:03:56.320 --> 00:04:00.520
The alternative might be
some kind of really bad virus.
00:04:00.640 --> 00:04:04.760
- A global pandemic.
- Because global warming isn't fast enough.
00:04:35.080 --> 00:04:38.800
A key part of this is getting
the right diagnosis.
00:04:38.920 --> 00:04:43.280
Until we get the right diagnosis of the
problem, it's hard to move to solutions.
00:04:45.320 --> 00:04:49.800
I think only if you look
at the history of humanity -
00:04:49.920 --> 00:04:52.640
- in the context
of the biosphere as a whole -
00:04:52.760 --> 00:04:56.840
- can you really grasp how strange
the moment we're living in is.
00:04:56.960 --> 00:05:01.560
Never in four billion years,
since life first began on this planet -
00:05:01.680 --> 00:05:06.600
- has a single species had
so much power collectively -
00:05:06.720 --> 00:05:10.320
- that they can change
the entire biosphere.
00:05:10.440 --> 00:05:13.640
Planet Earth is under new management.
00:05:16.960 --> 00:05:20.800
Given that we are still accelerating
our CO2 emissions -
00:05:20.920 --> 00:05:24.080
- at the end of the century,
the planet will be so warm -
00:05:24.200 --> 00:05:28.560
- that the effects of warming alone
will be formidable.
00:05:29.760 --> 00:05:32.760
We need to figure out
how to get to a world -
00:05:32.880 --> 00:05:36.080
- which has energy
without greenhouse gas effects.
00:05:39.320 --> 00:05:42.640
The problem that brought
this group together -
00:05:42.760 --> 00:05:49.000
- is a deep sense that
we are in a time of dangerous turbulence -
00:05:49.120 --> 00:05:53.160
- and there is both in this country
and around the developing world -
00:05:53.280 --> 00:05:56.720
- a massive amount of
social and economic instability.
00:05:56.840 --> 00:06:00.240
We believe that foundationally
it's been caused -
00:06:00.360 --> 00:06:04.360
- by 40 years
of rising economic inequality.
00:06:05.440 --> 00:06:08.560
The American dream was,
I'm going to do better than my parents.
00:06:08.680 --> 00:06:11.280
Half now do worse than their parents.
00:06:11.400 --> 00:06:15.480
The rates of depression
and anxiety are overwhelming.
00:06:15.600 --> 00:06:18.840
The number of people
on antidepressants is scary.
00:06:18.960 --> 00:06:23.720
The number of kids with ADHD.
The rate of addiction to...
00:06:23.840 --> 00:06:28.560
I used to think addiction to Facebook was
a joke or a metaphor, but no, it's real.
00:06:30.160 --> 00:06:33.480
We have a situation
that we didn't use to have.
00:06:33.600 --> 00:06:37.720
70-80% of people get their news
through social media platforms.
00:06:37.840 --> 00:06:41.600
The vast majority of people are
still consuming the mainstream media.
00:06:41.720 --> 00:06:45.120
But they're also consuming, depending on
whether you're on the left or the right -
00:06:45.240 --> 00:06:49.240
- this massive amount of propaganda.
And, of course, propaganda works.
00:06:50.560 --> 00:06:54.600
We've seen disinformation spread
like wildfire on social media.
00:06:54.720 --> 00:06:56.800
It's a huge problem in a society -
00:06:56.920 --> 00:07:00.240
- if you have entire sectors
of the population -
00:07:00.360 --> 00:07:03.960
- who believe fundamentally different
facts about the state of the world.
00:07:04.080 --> 00:07:10.480
You cannot have a democratic government
when your citizenry is as twisted -
00:07:10.600 --> 00:07:14.160
- because that's what propaganda does,
as ours is.
00:07:15.400 --> 00:07:21.120
I actually believe... we're not going
to win the war for democracy.
00:07:22.920 --> 00:07:27.080
Maybe there's a 3% chance we win,
but if you told me -
00:07:27.200 --> 00:07:31.160
- my son had brain cancer and there was
a 3% chance that we could beat it -
00:07:31.280 --> 00:07:34.720
- fine, I'm going to give
every bit of energy I can to beat it.
00:07:34.840 --> 00:07:36.760
That's how I feel about this.
00:07:39.120 --> 00:07:41.920
Democracies are sustained
by social cohesion -
00:07:42.040 --> 00:07:46.680
- and social cohesion is sustained
by reciprocity norms.
00:07:46.800 --> 00:07:53.080
And if you build an economy that
systematically impoverishes most people -
00:07:53.200 --> 00:07:57.960
- and puts most people
in a state of precariousness -
00:07:58.080 --> 00:08:02.320
- and insecurity and worry -
00:08:02.440 --> 00:08:06.200
- you destroy their capacity
to invest in themselves -
00:08:06.320 --> 00:08:10.000
- to take risks, to think
about the future, to be happy.
00:08:10.120 --> 00:08:15.240
And when you do that,
you get massive political polarization -
00:08:15.360 --> 00:08:19.360
- cynicism, anger, racism, xenophobia -
00:08:19.480 --> 00:08:24.200
- and, you know, that's what happens
before the end.
00:08:31.360 --> 00:08:35.240
Every aspect of our society is
increasingly being penetrated -
00:08:35.360 --> 00:08:37.920
- by different kinds of technology -
00:08:38.040 --> 00:08:42.800
- and as this process is moving
faster and faster -
00:08:42.920 --> 00:08:47.880
virtually nobody in the parliaments
understands science and engineering.
00:08:48.000 --> 00:08:55.160
And this is a toxic mixture
of incompetence and power.
00:08:57.360 --> 00:09:02.240
Today, we can see clearly the formation
of a critical infrastructure -
00:09:02.360 --> 00:09:07.200
- that encompasses all communication,
all data storage -
00:09:07.320 --> 00:09:12.160
- all financial transactions,
certainly all digital manufacturing -
00:09:12.280 --> 00:09:16.520
- increasingly everything that has
to do with controlling transportation -
00:09:16.640 --> 00:09:21.840
- energy, all our health data.
It influences every aspect of our life.
00:09:21.960 --> 00:09:25.000
We as humans have
to engage in figuring out -
00:09:25.120 --> 00:09:29.000
- how the heck do we make
a good governance structure of that -
00:09:29.120 --> 00:09:33.720
- because whoever's in control of that
will have complete control of us.
00:09:34.640 --> 00:09:39.160
I will assert that our educational system
at all levels is badly broken.
00:09:39.280 --> 00:09:44.960
We aren't doing due service to the younger
generations that we're trying to train.
00:09:45.080 --> 00:09:49.360
Because of the speed that
everything works at, we're now focused -
00:09:49.480 --> 00:09:54.760
- on the next quarter, the next minute,
the next Facebook like -
00:09:54.880 --> 00:09:57.480
- the next election.
00:09:57.600 --> 00:10:01.760
And instead of using big data
to predict the future -
00:10:01.880 --> 00:10:04.600
- we're using big data to do day trading -
00:10:04.720 --> 00:10:09.680
- and we're spending and trading
rather than investing in our future.
00:10:09.800 --> 00:10:14.400
You see that in our infrastructure, our
education system and how we do health.
00:10:14.520 --> 00:10:20.320
It's all short-term, and we've discounted
the future so dramatically.
00:10:21.480 --> 00:10:26.360
Our ability to manipulate the world
has skyrocketed decade by decade.
00:10:26.480 --> 00:10:31.280
It's creating the problems we're seeing.
Global warming being the obvious example.
00:10:31.400 --> 00:10:36.160
I worry most about climate change
because there's a clock ticking on that.
00:10:36.960 --> 00:10:42.320
Unless we think globally,
my children are going to suffer.
00:10:42.440 --> 00:10:46.080
So that's the argument that we need -
00:10:46.200 --> 00:10:49.600
- to understand how to communicate.
In a nutshell.
00:10:50.680 --> 00:10:56.760
We have to believe
that we humans are capable -
00:10:56.880 --> 00:11:00.640
- of generating the knowledge
and the skills needed -
00:11:00.760 --> 00:11:07.440
- to steer us towards
a more stable and sustainable -
00:11:07.560 --> 00:11:11.000
- and civilized future
in which people can live well.
00:11:11.120 --> 00:11:15.640
I have this nightmare sometimes.
You wake up, someone says:
00:11:15.760 --> 00:11:20.560
"Captain, are you ready to land?
We're near to LAX Airport."
00:11:20.680 --> 00:11:26.280
And I wake up and I look
at dials and screens on a jumbo jet.
00:11:26.400 --> 00:11:29.080
And I have no idea.
00:11:29.200 --> 00:11:33.600
We have to learn quickly
if we're to land this thing.
00:12:22.440 --> 00:12:24.880
We have people from
all over the world here.
00:12:25.000 --> 00:12:29.000
We all agree
that there's something big going on.
00:12:29.120 --> 00:12:33.240
I see my role as a scientist this way: it's not
my job to impose my values on others -
00:12:33.360 --> 00:12:38.200
- but it is my job to connect
the present to the future.
00:12:38.320 --> 00:12:44.200
I can tell somebody, "If you make these
choices, this is how the world will look."
00:12:44.320 --> 00:12:47.480
If they get an accurate picture
of how that world would look -
00:12:47.600 --> 00:12:51.400
- their values may help them
make better decisions.
00:12:55.600 --> 00:12:59.320
People make a lot of bad choices,
because they believe -
00:12:59.440 --> 00:13:03.720
- their values tell them something like
global warming is not a problem.
00:13:03.840 --> 00:13:07.760
Well, that's not a question of values.
That's a question of a misunderstanding -
00:13:07.880 --> 00:13:10.840
- about the relationship
between the present and the future.
00:13:10.960 --> 00:13:15.240
Because the world in which we just pump
CO2 in the atmosphere ad infinitum -
00:13:15.360 --> 00:13:19.160
- is going to be a world
that nobody's going to like.
00:13:21.160 --> 00:13:24.840
We're here mainly to understand
what can we change -
00:13:24.960 --> 00:13:27.560
- what kind of knobs can we turn -
00:13:27.680 --> 00:13:33.080
- so that we prevent disaster
based on fundamental values.
00:13:33.200 --> 00:13:37.000
Like, I would love for my grandchildren
to be able to live free and happy lives.
00:13:41.560 --> 00:13:46.000
We need a fundamentally new conception
based on science and empirical reality -
00:13:46.120 --> 00:13:50.520
- of human behavior, human institutions
and culture - an interdisciplinary effort -
00:13:50.640 --> 00:13:55.200
- bringing together the best ideas and
empirical facts into a coherent theory.
00:13:59.880 --> 00:14:05.280
As scientists, we must provide some
of these good sustainable trajectories -
00:14:05.400 --> 00:14:07.160
- because who else should do it?
00:14:07.280 --> 00:14:11.800
Natural scientists can't do it alone.
It has to be a diverse group of people.
00:14:11.920 --> 00:14:15.160
That's why we have
this diverse group of people around.
00:14:15.280 --> 00:14:19.720
A clearer conversation is needed
about the sort of world we'd like to see.
00:14:19.840 --> 00:14:22.720
I think that conversation
has to be global.
00:14:22.840 --> 00:14:26.160
It must include
Russians and Chinese.
00:14:26.280 --> 00:14:29.760
It must include
people in small-scale societies.
00:14:33.600 --> 00:14:37.200
I love conversations across disciplines.
00:14:37.320 --> 00:14:41.800
They're fairly rare, and this one is
across lots of disciplines.
00:14:41.920 --> 00:14:47.720
What I'd love to see come out of this is
we uncover more of these possibilities -
00:14:47.840 --> 00:14:52.240
- because by just putting out
some possibilities -
00:14:52.360 --> 00:14:55.880
- you get a different way
of measuring our access to solutions.
00:14:56.000 --> 00:15:02.000
What could we feasibly dream of,
that's within our reach right now?
00:15:03.000 --> 00:15:06.960
We're not done inventing.
Is this the best society we could have?
00:15:11.680 --> 00:15:17.160
This is a place where
I can both learn about and anchor ideas -
00:15:17.280 --> 00:15:20.600
- that I can then go
extend into the world -
00:15:20.720 --> 00:15:25.120
- and either knock down bad ideas
or build new ideas, right?
00:15:25.240 --> 00:15:28.760
Which is how you create
social change in the world.
00:15:31.440 --> 00:15:33.480
The system isn't working -
00:15:33.600 --> 00:15:37.320
- because it was built
for a clock speed that's 100 years old.
00:15:37.440 --> 00:15:40.000
Now the clock speed is just faster -
00:15:40.120 --> 00:15:45.240
- and that may imply political reforms
that people have not yet considered.
00:15:45.360 --> 00:15:50.440
When you talk about these things, people
get engaged, because we're all concerned.
00:15:50.560 --> 00:15:55.720
Yet people are thinking about issues that
bear on us from different perspectives.
00:15:55.840 --> 00:15:59.720
Why don't we, at some point in the future,
have a world technology organization -
00:15:59.840 --> 00:16:06.120
- that brings together all this research,
the movements ... It gives it a body.
00:16:09.960 --> 00:16:12.920
This group is not going to drive
a social change movement -
00:16:13.040 --> 00:16:16.440
- but could help inform
a social change movement.
00:16:16.560 --> 00:16:22.200
It's like a fire hose of ideas
and you just get blasted.
00:16:22.320 --> 00:16:24.640
You come out so much richer
but also exhausted.
00:16:24.760 --> 00:16:27.880
But that's fine.
What a great way to get exhausted.
00:16:29.320 --> 00:16:34.000
What is the vision and what do we want
in our society going forward?
00:16:34.120 --> 00:16:39.360
The Chinese are successful in terms of
wealth, power, status and leadership -
00:16:39.480 --> 00:16:41.840
- because they're pursuing
that clear goal -
00:16:41.960 --> 00:16:46.320
- and lining up resources and values
to go after that goal.
00:16:46.440 --> 00:16:49.640
Very different than other countries,
including the West.
00:16:53.240 --> 00:16:58.760
I feel like we have no vision and I admire
the Chinese for having such a vision -
00:16:58.880 --> 00:17:04.080
- and knowing and articulating so clearly
what their plans are to get there.
00:17:05.200 --> 00:17:10.080
The success of the West
in the last 250 years has been -
00:17:10.200 --> 00:17:15.560
- expanding the cooperation of strangers,
and identity has been a key part of that.
00:17:15.680 --> 00:17:19.680
A big part of the success of the American
model was creating an American identity.
00:17:19.800 --> 00:17:24.920
The circle of identity and cooperation
is now collapsing back to older forms.
00:17:25.040 --> 00:17:28.480
The fundamental question is how
do we get the circle expanding again?
00:17:28.600 --> 00:17:33.280
I lived in Canada 10 years and I became
Canadian, and when we took the oath -
00:17:33.400 --> 00:17:38.920
- the judge asked us to keep our
identities, in my case, my Greek identity.
00:17:39.040 --> 00:17:44.760
The identity of being Canadian is to be
proud of everybody's different identity.
00:17:44.880 --> 00:17:50.000
But I always felt this is because they're
a rich country with no obvious enemies.
00:17:54.240 --> 00:17:57.760
Having multiple identities is great,
but it's probably something you enable -
00:17:57.880 --> 00:18:01.240
- rather than something
you can start with.
00:18:01.360 --> 00:18:07.280
In schools today, we overwhelmingly teach
people about their national identity -
00:18:07.400 --> 00:18:11.920
- and I think that's dangerous
because it smuggles in the idea -
00:18:12.040 --> 00:18:15.520
- that your fundamental loyalty
is sort of tribal.
00:18:15.640 --> 00:18:19.200
In a world with nuclear weapons
and problems like climate change -
00:18:19.320 --> 00:18:22.560
- that have to be solved by
collaboration across many nations -
00:18:22.680 --> 00:18:26.800
- I don't think that's the identity
that should dominate.
00:18:26.920 --> 00:18:30.400
I know of nowhere in the world where
courses are taught in human history.
00:18:30.520 --> 00:18:34.960
If we keep teaching about national
histories, we just guarantee more wars.
00:18:35.080 --> 00:18:37.600
The internal values of people are
radically different -
00:18:37.720 --> 00:18:41.720
- in how they want to organize the world,
so at the global governance level -
00:18:41.840 --> 00:18:45.240
- there's very little that the global
government should worry about.
00:18:45.360 --> 00:18:47.600
Maybe it should worry
about peace alone.
00:18:47.720 --> 00:18:50.800
- There shall not be war. That's enough.
- Good start.
00:18:55.120 --> 00:19:01.200
But our reigning institutions weaken our
confidence in global governance right now.
00:19:02.080 --> 00:19:05.520
There's a meta question
about who gets to decide what's good.
00:19:05.640 --> 00:19:10.520
One of my irreducible goals is
that I want to live in a society -
00:19:10.640 --> 00:19:15.400
- where the majority of citizens get
to decide what's good for them.
00:19:15.520 --> 00:19:21.520
Not a few people at the very tippy top
deciding what's good for them.
00:19:21.640 --> 00:19:24.840
The answers that the majority will come to
will be imperfect -
00:19:24.960 --> 00:19:27.960
- but they will almost certainly
beat the shit out of the, you know.
00:20:12.920 --> 00:20:16.360
There are three big dates
in the history of the biosphere.
00:20:16.480 --> 00:20:18.720
First is the origin of life.
00:20:18.840 --> 00:20:22.440
The second is multicellular life,
and the third is right now.
00:20:24.600 --> 00:20:27.680
This is a transformative moment
on a scale of four billion years.
00:20:27.800 --> 00:20:30.480
Now, if I'm right, you cannot see that -
00:20:30.600 --> 00:20:34.160
- unless you're willing to contemplate
a scale of four billion years.
00:20:44.000 --> 00:20:49.240
Origin stories help you
place yourself in the universe.
00:20:49.360 --> 00:20:53.480
They tell you how you came to be,
how your people came to be -
00:20:53.600 --> 00:20:58.320
- how the Earth came to be, how life
came to be, how the stars came to be.
00:20:58.440 --> 00:21:02.880
It's a sort of mapping.
It allows you to say:
00:21:03.000 --> 00:21:06.440
This is who I am.
This is when I live.
00:21:06.560 --> 00:21:12.280
This is where I live. This is
the sort of universe that I live in.
00:21:12.400 --> 00:21:16.880
And that's a very powerful way
of thinking about who you are.
00:21:17.000 --> 00:21:19.080
And as with any mapping -
00:21:19.200 --> 00:21:24.360
- if the mapping is good,
then it will teach you what is possible.
00:21:40.040 --> 00:21:43.680
We're looking at
an astonishing acceleration -
00:21:43.800 --> 00:21:46.480
- in a process
that's 200,000 years old.
00:21:46.600 --> 00:21:51.080
To fully understand it,
you need to understand the phases -
00:21:51.200 --> 00:21:53.400
- that this story has gone through.
00:21:53.520 --> 00:21:55.960
I go right back to the beginning -
00:21:56.080 --> 00:22:00.680
- with a species that's capable of
exchanging information so efficiently -
00:22:00.800 --> 00:22:03.040
- that that information begins
to accumulate.
00:22:03.160 --> 00:22:08.480
And then also in this long story, what
you see is what I call mega innovations.
00:22:08.600 --> 00:22:12.160
Agriculture is one.
That allows populations to grow.
00:22:12.280 --> 00:22:17.000
Population growth requires
shifts in social structures.
00:22:17.120 --> 00:22:20.000
Shifts that give us
traditional civilization.
00:22:20.120 --> 00:22:24.200
The industrial revolution
was one more mega innovation.
00:22:24.320 --> 00:22:27.680
A globalizing world
encouraged innovation -
00:22:27.800 --> 00:22:31.640
- that took us across
this fundamental technological divide -
00:22:31.760 --> 00:22:35.840
- where we got access not just to
the energy of recent photosynthesis -
00:22:35.960 --> 00:22:37.840
- which is what agriculture did -
00:22:37.960 --> 00:22:42.760
- but to the stored results of
photosynthesis over 300 million years.
00:22:48.240 --> 00:22:52.640
If you mentally extend this graph
of total human energy use -
00:22:52.760 --> 00:22:55.920
- it's about 50 times to the left.
00:22:56.040 --> 00:22:59.360
- What's the starting date on your graph?
- It's 1800.
00:22:59.480 --> 00:23:03.720
So something like 50 times will take you
back to the end of the last ice age.
00:23:03.840 --> 00:23:08.040
If you imagine that graph, then you see
this spike of total human energy use -
00:23:08.160 --> 00:23:10.160
- and how astonishing this spike is.
00:23:10.280 --> 00:23:14.920
I don't think you can understand
modernity without seeing -
00:23:15.040 --> 00:23:20.520
- that there's been this staggering
bonanza of virtually free energy -
00:23:20.640 --> 00:23:24.200
- which has driven
so much of modern change.
00:23:35.480 --> 00:23:38.400
Modernity has created very good lives -
00:23:38.520 --> 00:23:42.560
- which we often take for granted,
but this is a very new thing.
00:23:42.680 --> 00:23:47.960
Thinking about the future, you don't
decide: Am I optimistic or pessimistic?
00:23:48.080 --> 00:23:53.520
Optimism is a kind of necessary part
of the equipment we need -
00:23:53.640 --> 00:23:56.640
- if we're to solve these problems.
00:23:56.760 --> 00:24:02.720
I think there are serious grounds
for optimism about a nonrevolutionary -
00:24:02.840 --> 00:24:06.720
- non-catastrophic resolution
of these tensions -
00:24:06.840 --> 00:24:12.120
- many of which arose simply because of
the sheer speed of technological change.
00:24:12.240 --> 00:24:17.200
We know what the challenge is.
We have to assume there's a good outcome.
00:24:17.320 --> 00:24:22.480
There are no guarantees. There may be
costs. That's the attitude we must have.
00:24:56.760 --> 00:25:02.000
We just assume that democracy is
the best thing there is, end of story.
00:25:02.120 --> 00:25:06.640
I'm questioning something
that may be politically incorrect.
00:25:06.760 --> 00:25:11.920
Can democracy work in this epoch
we are entering?
00:25:26.200 --> 00:25:29.720
If we want to have
an enlightened democracy -
00:25:29.840 --> 00:25:35.720
- how can we force people
to understand more -
00:25:35.840 --> 00:25:39.360
- if our survival depends on it?
00:25:48.640 --> 00:25:51.840
This is an open question.
What do you think about this?
00:25:51.960 --> 00:25:57.080
I think altering democracy is one of the
key things we should be thinking about -
00:25:57.200 --> 00:26:01.360
- and how we can tune democracy
to make it work better.
00:26:01.480 --> 00:26:06.360
So I agree, Steen, with your skepticism.
If our future depends -
00:26:06.480 --> 00:26:10.320
- on people getting a deeper scientific
understanding of climate change -
00:26:10.440 --> 00:26:14.400
- and acting rationally
on that information, we are screwed.
00:26:14.520 --> 00:26:18.120
But what history tells us does work is -
00:26:18.240 --> 00:26:22.200
- over time, populations can
adopt new social norms.
00:26:22.320 --> 00:26:26.800
One of the successes of democracies,
including the British one, which is old -
00:26:26.920 --> 00:26:32.080
- is that they widely wired into society
those norms and rules of the road -
00:26:32.200 --> 00:26:35.800
- which people don't think much about -
00:26:35.920 --> 00:26:39.840
- but which get transmitted from
generation to generation and acted on.
00:26:49.400 --> 00:26:53.520
Are there a new set of norms and rules
that could be transmitted -
00:26:53.640 --> 00:26:56.520
- through these cultural mechanisms
that we know work?
00:26:56.640 --> 00:27:01.320
And could that be done on a time scale
that matters for the challenges we face?
00:27:01.440 --> 00:27:05.520
But there's also a harder scientific
question of what would those be?
00:27:10.520 --> 00:27:15.800
What is this notion that democracy
is no longer functioning built on?
00:27:15.920 --> 00:27:22.480
In some ways Brexit and Trump are showing
that democracy is actually functioning -
00:27:22.600 --> 00:27:27.360
- because a lot of people haven't
had their voice heard for a long time -
00:27:27.480 --> 00:27:30.000
- and Trump and Brexit reflect that.
00:27:32.840 --> 00:27:36.880
If people are making choices
that are producing outcomes -
00:27:37.000 --> 00:27:40.640
- that will serve them in the future,
then democracy's functioning.
00:27:40.760 --> 00:27:44.880
But if they're making choices that,
for example, are based on lies -
00:27:45.000 --> 00:27:50.080
- they're not producing outcomes that
serve the very people that are voting.
00:27:52.600 --> 00:27:56.440
What's the empirical evidence
that democracy is broken?
00:27:56.560 --> 00:28:00.480
I offer the following:
When you do surveys of people -
00:28:00.600 --> 00:28:04.200
- asking what their preferences are
on a host of issues, from -
00:28:04.320 --> 00:28:07.800
- in the US, climate change, guns,
in other countries, other issues -
00:28:07.920 --> 00:28:12.120
- there is evidence of a widening gap
between preferences and political outcomes.
00:28:12.240 --> 00:28:16.800
You have very large majorities for certain
positions on political issues in the US -
00:28:16.920 --> 00:28:19.200
- that aren't being achieved
through politics.
00:28:19.320 --> 00:28:24.480
82% of the US population supports
some basic restrictions on guns.
00:28:24.600 --> 00:28:26.920
There is 0% chance of that happening.
00:28:27.040 --> 00:28:31.560
That's a huge disconnect between broad
social preferences and political outcomes.
00:28:31.680 --> 00:28:36.160
So we've seen a system that's
become more disconnected from reality.
00:28:56.560 --> 00:29:00.000
There are many ways
in which corporations trump humanity.
00:29:00.120 --> 00:29:04.800
They can be more rational than humanity
can on all sorts of dimensions.
00:29:04.920 --> 00:29:07.320
They can maximize in a more efficient way.
00:29:08.680 --> 00:29:12.080
One scary thing
about where we are right now is -
00:29:12.200 --> 00:29:18.200
- that this superior form of intelligence
has already found a way -
00:29:18.320 --> 00:29:22.600
- to capture the only mode
that we humans have -
00:29:22.720 --> 00:29:26.240
- to think reflectively
about what our future should be.
00:29:26.360 --> 00:29:31.000
Government has been captured
by corporations everywhere -
00:29:31.120 --> 00:29:34.480
- except maybe in the Nordic countries.
I don't want to get into that.
00:29:34.600 --> 00:29:39.840
The point is, I think there's an urgency
right now if humanity is to prevail.
00:29:39.960 --> 00:29:43.200
As a member of humanity,
I'm all for us winning.
00:29:43.320 --> 00:29:45.680
But if we're going to have a fair fight -
00:29:45.800 --> 00:29:51.200
- how we reclaim the ability
to self-govern becomes really critical.
00:30:10.000 --> 00:30:14.000
People would have said in 1948,
what the world needs now is democracy.
00:30:14.120 --> 00:30:18.080
If we just spread democracy everywhere,
happiness will spread everywhere -
00:30:18.200 --> 00:30:21.280
- and flourishing and prosperity
will happen everywhere.
00:30:21.400 --> 00:30:25.840
And indeed, if you look
at the history of the world since 1948 -
00:30:25.960 --> 00:30:28.960
- there's been
an extraordinary explosion -
00:30:29.080 --> 00:30:33.040
- in the pervasiveness
of "democracy" everywhere.
00:30:33.160 --> 00:30:38.320
But if you ask, in fact,
whether the people rule -
00:30:38.440 --> 00:30:42.920
- the answer is at least from the
perception of the people, not so much.
00:30:43.040 --> 00:30:49.520
At this moment, global dissatisfaction
with democracy has never been higher.
00:30:49.640 --> 00:30:51.760
What scares me the most -
00:30:51.880 --> 00:30:57.400
- is the growing skepticism among
the elite about the idea of democracy.
00:30:58.800 --> 00:31:03.720
A common question I hear in many different
contexts: What's wrong with the people?
00:31:03.840 --> 00:31:06.520
How could the people be so stupid?
00:31:08.560 --> 00:31:12.160
My view is that this sneer
is obtuse and oblivious -
00:31:12.280 --> 00:31:16.360
- and it misses a critical truth
about why we are here -
00:31:16.480 --> 00:31:19.640
- and the urgent need to fix it.
00:31:21.560 --> 00:31:27.360
And I want to focus on a change
that has been triggered by technology.
00:31:27.480 --> 00:31:29.960
Broadcasting was the emergence
of a technology -
00:31:30.080 --> 00:31:35.080
- that enabled many to hear
at one time the same thing.
00:31:37.240 --> 00:31:42.440
We heard the people through polling just
when they had something smart to say.
00:31:42.560 --> 00:31:45.720
They had something smart to say
because we knew a common set of facts -
00:31:45.840 --> 00:31:49.360
- and we knew a common set of facts
because of a platform of communication -
00:31:49.480 --> 00:31:52.120
- which forced us to know those facts.
00:31:53.440 --> 00:31:56.240
We're now in the stage after.
00:32:02.120 --> 00:32:06.880
The idea of broadcast is gone. Radical
fragmentation of attention happens.
00:32:07.000 --> 00:32:12.960
We produce many sources of information,
leading to this inevitable fragmentation.
00:32:13.080 --> 00:32:17.240
Those who were forced before
to watch the news, for example -
00:32:17.360 --> 00:32:21.400
- because that's the only thing that was
on between 6 and 7, aren't forced anymore.
00:32:21.520 --> 00:32:25.160
They can watch
whatever they want to watch.
00:32:32.600 --> 00:32:38.560
So I think the internet has given us
incredible diversity in culture -
00:32:38.680 --> 00:32:43.120
- that is to be celebrated and trumps
everything that happened before -
00:32:43.240 --> 00:32:46.760
- but it's a terrible thing
for common knowledge -
00:32:46.880 --> 00:32:50.480
- and therefore
a terrible thing for democracy.
00:32:54.440 --> 00:32:59.960
And we're never going back. We will always
live as human culture has always lived -
00:33:00.080 --> 00:33:03.960
- except for the 20th century,
in this really deeply fragmented world -
00:33:04.080 --> 00:33:07.920
- of all of us knowing different things
at the same time.
00:33:15.240 --> 00:33:18.800
What's the response?
There's an anti-democracy response:
00:33:18.920 --> 00:33:22.760
The people are absurd,
so let's find a way to ignore the people.
00:33:22.880 --> 00:33:27.560
Then there's a pro-democracy response,
which imagines us going back -
00:33:27.680 --> 00:33:32.680
- to the kind of common sphere
where we all understand everything.
00:33:32.800 --> 00:33:37.640
These responses both presume
what democracy needs is all of us -
00:33:37.760 --> 00:33:42.600
- to be functioning
at a high level at a particular time.
00:33:42.720 --> 00:33:47.120
We need to find a way to give up on
that assumption and to recognize instead -
00:33:47.240 --> 00:33:52.160
- that "we" is always built
and we must always defend -
00:33:52.280 --> 00:33:56.360
- how the "we" that we are paying
attention to is constructed.
00:34:02.080 --> 00:34:04.280
Think about the presidency:
00:34:04.400 --> 00:34:09.280
When we imagine the president speaking,
we don't imagine him tweeting.
00:34:09.400 --> 00:34:12.680
For the president to speak is for
the president to speak at the end -
00:34:12.800 --> 00:34:16.400
- of a very elaborate process
where he's been informed.
00:34:16.520 --> 00:34:20.120
In America, it's unfortunately only ever
been "he", but he's been informed -
00:34:20.240 --> 00:34:24.200
- and he has reflected
and has a chance to present his views -
00:34:24.320 --> 00:34:29.160
- and that is the view of the president.
And in that process, he gets staff.
00:34:30.680 --> 00:34:35.360
So why not us?
Because we the people have views -
00:34:35.480 --> 00:34:38.440
- but our views aren't constructed,
our views are found.
00:34:38.560 --> 00:34:42.960
Somebody calls us up during dinner:
"What do you think about NAFTA?"
00:34:43.080 --> 00:34:47.600
Or "Should we have thorium reactors
that replace natural gas?"
00:34:47.720 --> 00:34:51.160
So we give our views,
and those views get reflected -
00:34:51.280 --> 00:34:55.640
- however, or from
whomever they are collected.
00:35:02.200 --> 00:35:05.960
We say, "71% of us believe
we should go to war in Iraq" -
00:35:06.080 --> 00:35:11.720
- but those views are not informed.
They're uninformed, or differently informed.
00:35:11.840 --> 00:35:16.000
And in this process,
nobody helps us understand anything.
00:35:30.200 --> 00:35:36.240
I'm trying to encourage the idea
that we think about understanding "we" -
00:35:36.360 --> 00:35:41.840
- without "we" speaking
for all of us at the same time.
00:35:45.560 --> 00:35:49.040
Think, for example, of the idea of a jury.
00:35:49.160 --> 00:35:53.800
In the US, juries are a central part
of making important decisions.
00:35:53.920 --> 00:35:57.400
They decide whether
somebody gets executed or not.
00:35:57.520 --> 00:36:04.240
The idea of a jury, this collection of a
dozen people, is a highly regulated idea.
00:36:04.360 --> 00:36:08.600
There are rules about what
they're not allowed to hear and consider.
00:36:08.720 --> 00:36:13.120
If at any moment along that process
you did a poll of the jury and said:
00:36:13.240 --> 00:36:15.400
"Is she guilty or not?"
00:36:15.520 --> 00:36:20.800
"Yeah, nine of us think she's guilty."
That would not be the view of the jury.
00:36:20.920 --> 00:36:25.520
The view of the jury is the product
of this huge process.
00:36:25.640 --> 00:36:30.040
And once it's finished, it has
incredible normative significance -
00:36:30.160 --> 00:36:33.240
- life-and-death significance
in some contexts.
00:36:47.280 --> 00:36:50.360
The idea that has the most potential
is deliberative polls -
00:36:50.480 --> 00:36:52.480
- that Jim Fishkin has advanced.
00:36:52.600 --> 00:36:57.360
Deliberative polls are the idea
of regular polls plus something.
00:36:57.480 --> 00:37:02.840
You take a random representative selection
of a public, but then you inform them -
00:37:02.960 --> 00:37:06.240
- and give them an opportunity
to deliberate in small and large groups.
00:37:10.840 --> 00:37:17.920
This is a picture of Mongolia. This is a
random representative sample of Mongolia.
00:37:18.040 --> 00:37:20.200
Mongolia is the size of Western Europe -
00:37:20.320 --> 00:37:24.040
- so half these people
spent two nights on a bus -
00:37:24.160 --> 00:37:29.720
- to sit in the parliament and deliberate
about proposed constitutional reform.
00:37:29.840 --> 00:37:33.600
As a constitutional law professor,
I was deeply skeptical -
00:37:33.720 --> 00:37:38.200
- that these ordinary Mongolians
would be able to understand anything -
00:37:38.320 --> 00:37:43.240
- about the really complicated question
that they were being asked to address.
00:37:43.360 --> 00:37:47.200
But I was astonished
by the incredible sophistication -
00:37:47.320 --> 00:37:50.840
- and common wisdom that evolved
over these two days.
00:37:50.960 --> 00:37:52.960
The consequence of this poll -
00:37:53.080 --> 00:37:56.360
- is something any constitutional law
professor would be proud of.
00:37:58.280 --> 00:38:03.440
And it had a persuasive effect
on what the parliament ultimately did.
00:38:11.200 --> 00:38:15.840
The process involves bringing both sides
in, working through for months in advance:
00:38:15.960 --> 00:38:20.560
What information does the public need
to understand both sides of the issue -
00:38:20.680 --> 00:38:23.680
- and then developing it
so that the public can understand it.
00:38:23.800 --> 00:38:28.440
It's essential that both sides believe
that their side has been represented well.
00:38:28.560 --> 00:38:32.760
You have to make sure that it's the right
mix of male, female, rich, poor -
00:38:32.880 --> 00:38:36.240
- professional, nonprofessional,
all the relevant demographic categories -
00:38:36.360 --> 00:38:39.560
- have to be represented in a random way.
00:38:41.160 --> 00:38:45.840
It seems that the image of these people
deliberating is people in a room.
00:38:45.960 --> 00:38:49.600
Is that an essential ingredient?
00:38:49.720 --> 00:38:53.600
One problem with online deliberation
is it's too voluntary.
00:38:53.720 --> 00:38:59.000
You can easily opt out or be isolated. The
deliberation contexts are important, too.
00:38:59.120 --> 00:39:04.120
If you allow people to deliberate
in their own spheres -
00:39:04.240 --> 00:39:08.160
- polarization increases.
Understanding decreases.
00:39:08.280 --> 00:39:12.640
Psychological literature says it has to be
in person because of facial expressions.
00:39:12.760 --> 00:39:15.480
When you disagree and reach out
and touch somebody -
00:39:15.600 --> 00:39:18.320
- it's a very high bandwidth interaction.
00:39:18.440 --> 00:39:23.240
If this is something that we all agree on,
then we are really saying that -
00:39:23.360 --> 00:39:28.520
- some new technologies don't go together
with some essential aspects of democracy.
00:39:28.640 --> 00:39:31.040
Therefore, we do need to backtrack -
00:39:31.160 --> 00:39:34.960
- and find a place for old-style
techniques like people in a room.
00:39:44.040 --> 00:39:49.000
I do think that the next level of our
technological change is going to be -
00:39:49.120 --> 00:39:52.600
- when the technology itself
reflects what it means to be human.
00:39:52.720 --> 00:39:57.240
AI is a future form of self-governance -
00:39:57.360 --> 00:40:02.160
- where you collect data on people,
analyze it, and we vote with our behavior.
00:40:02.280 --> 00:40:09.040
So why aren't we saying that 3% of
deliberative representation happens -
00:40:09.160 --> 00:40:13.880
- and 97%, some AI and some
central decision-making by somebody -
00:40:14.000 --> 00:40:16.320
- is just what's going to happen?
00:40:16.440 --> 00:40:20.360
Predicting what will happen,
I would agree with you.
00:40:20.480 --> 00:40:25.880
Describing that in a neutral way,
I think is a terrible mistake -
00:40:26.000 --> 00:40:30.120
- because that future is
a terrible future.
00:40:30.240 --> 00:40:33.880
I can say,
what I want is environmental regulation -
00:40:34.000 --> 00:40:37.760
- but what I'm going to do is
what will actually benefit me.
00:40:37.880 --> 00:40:39.280
That's why we institute government.
00:40:39.400 --> 00:40:43.960
That depends on what you put in the AI
algorithm, which we might choose or not.
00:40:44.080 --> 00:40:49.000
But the "you" is the ambiguous statement,
because you're right that -
00:40:49.120 --> 00:40:54.200
- China will put one thing in the
algorithm, Singapore will put another -
00:40:54.320 --> 00:40:58.040
- and neoliberalism will put
a third thing in the algorithm.
00:40:58.160 --> 00:41:01.680
The question is whether we as citizens
feel responsibility for the thing -
00:41:01.800 --> 00:41:04.880
- we're putting into the algorithm.
00:41:05.000 --> 00:41:10.200
Fotini, you really managed
to push my button.
00:41:10.320 --> 00:41:14.680
I was surprised,
but I saw you were smiling -
00:41:14.800 --> 00:41:18.160
- so I know you said that
to get some ideas out.
00:41:18.280 --> 00:41:24.920
If you allow this infrastructure of
everything to monitor our behavior -
00:41:25.040 --> 00:41:29.120
- that means you lose your autonomy
and you lose your authority.
00:41:29.240 --> 00:41:33.400
So I was wondering, did you have
some other idea that I just didn't get?
00:41:33.520 --> 00:41:40.480
While we argue about privacy, things are
happening to billions of people.
00:41:40.600 --> 00:41:44.680
So I'm worried that we're sitting around
talking while it's happening.
00:41:44.800 --> 00:41:48.560
I'm actually not 100% convinced
it is a dystopia.
00:41:48.680 --> 00:41:50.520
I'm not saying that I disagree with you -
00:41:50.640 --> 00:41:56.080
- but I think there may be an evolution
of these values for a new world.
00:41:56.200 --> 00:42:01.720
And one way that you can help them
evolve is by challenging them.
00:42:01.840 --> 00:42:05.280
The Lord disagrees!
00:42:06.720 --> 00:42:09.800
Keep your freedom, Fotini!
00:43:03.440 --> 00:43:09.320
There are a number of alarming trends
identified by political scientists -
00:43:09.440 --> 00:43:12.560
- that we've seen in the US
and in democracies throughout the world.
00:43:12.680 --> 00:43:16.360
The first is the spread
of disinformation -
00:43:16.480 --> 00:43:19.400
- that has been used by malicious actors -
00:43:19.520 --> 00:43:25.120
- who want to undermine trust in
democratic institutions around the world.
00:43:25.240 --> 00:43:28.880
The second is polarization,
the fact that -
00:43:29.000 --> 00:43:32.760
- the left and the right
are moving apart over time -
00:43:32.880 --> 00:43:36.720
- both in terms of what they think
about issues, but also, more worryingly -
00:43:36.840 --> 00:43:39.000
- how they feel about the other side.
00:43:39.120 --> 00:43:41.600
There's this increase
in affective polarization -
00:43:41.720 --> 00:43:45.560
- where left increasingly hates the right,
doesn't want anything to do with them -
00:43:45.680 --> 00:43:51.520
- doesn't want their kid to marry somebody
whose family is right-wing and vice versa.
00:43:51.640 --> 00:43:54.520
The third is participation
in the public square.
00:43:54.640 --> 00:43:59.040
Democracy only really functions well
if all citizens are able to have a say -
00:43:59.160 --> 00:44:03.600
- and to discuss in public forums
how they feel about the issues -
00:44:03.720 --> 00:44:07.080
- and how they think
society should be run.
00:44:07.200 --> 00:44:12.200
There's a worry that digital outrage might
facilitate the spread of disinformation.
00:44:12.320 --> 00:44:17.680
It might accelerate polarization and limit
participation in the public square.
00:44:27.040 --> 00:44:32.720
It's actually amazing that we evolved to
feel outrage on behalf of other people -
00:44:32.840 --> 00:44:36.080
- but if it's an error, and I worry that -
00:44:36.200 --> 00:44:40.240
- these new technologies are increasing
the frequency of these errors -
00:44:40.360 --> 00:44:42.000
- the results can be hideous.
00:44:51.440 --> 00:44:55.440
Imagine if we were communicating,
having dinner, by each of us standing up -
00:44:55.560 --> 00:44:59.520
- whenever we wanted to and yelling at
everybody, "My steak is great, I love it."
00:44:59.640 --> 00:45:04.280
And then we sit down again
and everybody goes:
00:45:04.400 --> 00:45:06.000
"I like that. I like that, Johan."
00:45:07.160 --> 00:45:08.840
We don't communicate that way.
00:45:08.960 --> 00:45:12.200
If you look at how communication
takes place on Twitter and Facebook -
00:45:12.320 --> 00:45:17.160
- it's strange that we all collectively
got entranced by these environments -
00:45:17.280 --> 00:45:21.520
- and that we bought into
that paradigm of communicating.
00:45:29.520 --> 00:45:33.280
If you build an eight-lane asphalt road
right through the middle of town -
00:45:33.400 --> 00:45:37.280
- people will speed, and you could put up
little signs: "Oh, you shouldn't speed."
00:45:37.400 --> 00:45:41.280
You could put the cops there
and do all kinds of things -
00:45:41.400 --> 00:45:46.240
- but they will speed, because the road is
built for speed. That's just human nature.
00:45:46.360 --> 00:45:51.440
So instead of having the police and signs
and laws, just build a different road.
00:45:51.560 --> 00:45:56.560
If you know how people are, accommodate
it and make it better for everybody.
00:45:56.680 --> 00:46:01.160
Make the road more narrow,
add a speed bump, plant some trees.
00:46:01.280 --> 00:46:04.920
That's working with human nature
rather than ignoring it -
00:46:05.040 --> 00:46:09.320
- and then just trying
to kludge your way out of the problem.
00:46:11.200 --> 00:46:14.040
With social media, that's what we have.
00:46:14.160 --> 00:46:18.840
We had this idea we're going to connect
everybody. It's going to be great.
00:46:18.960 --> 00:46:24.240
Instead of sending an email to one person,
you can send it to 100,000 people at once.
00:46:24.360 --> 00:46:29.120
So, it's going to be "social media".
That's the idea.
00:46:29.240 --> 00:46:33.080
Everybody is their own
television station or radio station.
00:46:33.200 --> 00:46:36.520
But what they didn't fully account for,
I'm not pointing fingers -
00:46:36.640 --> 00:46:40.040
- because to me, that sounded
like a great idea as well -
00:46:40.160 --> 00:46:44.160
- but it might be that now
these technologies have run amok -
00:46:44.280 --> 00:46:48.680
- and have become almost abusive
to human nature.
00:46:54.560 --> 00:47:00.040
If moral outrage is a fire,
the internet might be like gasoline.
00:47:00.160 --> 00:47:01.840
There's good evidence now -
00:47:01.960 --> 00:47:06.600
- that moral emotional content spreads
faster and farther on social media.
00:47:06.720 --> 00:47:11.840
So to the extent that bad actors,
who want to spread disinformation -
00:47:11.960 --> 00:47:16.720
- can inject that disinformation with
outrage, it'll spread further and faster.
00:47:16.840 --> 00:47:21.240
Words like attack, bad, blame,
care, destroy...
00:47:21.360 --> 00:47:24.000
Every moral emotional word in a tweet -
00:47:24.120 --> 00:47:28.360
- increases its retweet likelihood
by 20 percent.
00:47:28.480 --> 00:47:31.000
These are massive effects.
00:47:32.840 --> 00:47:35.920
Humanity is a social species. We don't
acknowledge this often enough -
00:47:36.040 --> 00:47:40.520
- but you simply cannot be happy
without social relationships.
00:47:40.640 --> 00:47:45.920
Increasingly, these social relationships
are being mediated by technology.
00:47:46.040 --> 00:47:51.800
1/7 of the world's population is a member
of Twitter, Facebook or the like.
00:47:51.920 --> 00:47:57.000
We should all be so much happier now
thanks to Facebook and Twitter.
00:47:57.120 --> 00:48:01.280
And is that really the case?
Empirical data suggests no.
00:48:01.400 --> 00:48:05.600
The smartphone generations are not
hanging out with friends.
00:48:05.720 --> 00:48:09.040
See the release of the iPhone in 2007?
It just drops off.
00:48:09.160 --> 00:48:11.840
Less sex, less likely to get enough sleep.
00:48:11.960 --> 00:48:15.720
This also takes a toll on mental health.
More likely to feel lonely.
00:48:15.840 --> 00:48:19.760
Facebook wants you to spend as
much time on their platform as you can -
00:48:19.880 --> 00:48:24.880
- but it's associated with a sharp decline
in subjective well-being in young adults.
00:48:28.080 --> 00:48:32.120
There are experimental data, where they
told some people not to use Facebook -
00:48:32.240 --> 00:48:35.200
- and another cohort
to use it to their liking -
00:48:35.320 --> 00:48:39.320
- and they observed sharp differences
in the quality of life.
00:48:39.440 --> 00:48:44.640
Any parameter of social and psychological
well-being is affected there.
00:48:44.760 --> 00:48:50.040
So this global brain and how the internet
is connecting people, that's good.
00:48:50.160 --> 00:48:54.680
But these environments also create
social and psychological dysfunction -
00:48:54.800 --> 00:48:58.680
- that might outweigh the good
that they bring about.
00:48:58.800 --> 00:49:02.920
We looked at the Twitter network; we had
five million randomly chosen individuals -
00:49:03.040 --> 00:49:06.680
- and we harvested all their timelines.
Twitter allows you to do that.
00:49:06.800 --> 00:49:12.000
3,200 most recent tweets, which
gives you a couple of years of activity.
00:49:14.520 --> 00:49:18.600
You have a computer read a text and say
that's a positive text, that's a negative.
00:49:18.720 --> 00:49:23.040
Then you combine it all together
and produce a time series graph -
00:49:23.160 --> 00:49:27.320
- and say, "Oh, now people were very down,
and now everybody was really happy."
00:49:27.440 --> 00:49:31.880
So you could keep track of
how entire populations are feeling.
00:49:32.000 --> 00:49:37.280
We found that most people in these social
networks are surrounded by friends -
00:49:37.400 --> 00:49:39.880
- that are, on average, happier than them.
00:49:44.680 --> 00:49:47.680
We see these two very strong clusters.
00:49:47.800 --> 00:49:54.000
Unhappy people tend to be connected to
friends that on average are also unhappy.
00:49:54.120 --> 00:49:56.280
Think about this.
You're miserable.
00:49:56.400 --> 00:50:00.240
You're connected to miserable people,
and they are still...
00:50:00.360 --> 00:50:04.720
- Less miserable than you.
- On average.
00:50:04.840 --> 00:50:08.720
- So this is bad.
- Life is just shit.
00:50:13.760 --> 00:50:17.480
On the basis of my findings, I deleted
my Facebook and Twitter accounts.
00:50:17.600 --> 00:50:22.720
So I'm no longer in social media
and it's bliss. You should join me.
00:50:25.920 --> 00:50:30.760
The question is, how do we change
the business model, fix the algorithms -
00:50:30.880 --> 00:50:34.000
- so that we can minimize these effects?
00:50:34.120 --> 00:50:38.880
First I thought about limiting the
number of friends that you can have.
00:50:39.000 --> 00:50:41.600
Apply Dunbar's number and just say:
00:50:41.720 --> 00:50:45.640
"Your allotment is 150 because that's
about the most you can keep track of."
00:50:54.480 --> 00:50:58.480
This technological change has not only
created the problem, but created a need -
00:50:58.600 --> 00:51:01.600
- to rethink the fundamental norms,
not abandon free speech -
00:51:01.720 --> 00:51:07.040
- but rethink the way we think
about it against the new technology.
00:51:15.000 --> 00:51:18.240
What do we know about
the regime before the new technology?
00:51:18.360 --> 00:51:23.560
We know that there was curation,
and we were actually okay with that.
00:51:23.680 --> 00:51:28.240
Now, the reduction of the cost enables
every individual to get their speech out -
00:51:28.360 --> 00:51:33.160
- and the realization that with that comes
a cost that we still need to avoid.
00:51:33.280 --> 00:51:37.280
- How would you articulate that cost?
- It's the undermining of democracy.
00:51:37.400 --> 00:51:43.360
Through the ability of anybody
with any crazy idea to push it out there.
00:51:44.880 --> 00:51:50.680
We have the option of regulation, finding
some place to restore the curation -
00:51:50.800 --> 00:51:54.800
- and/or of increasing the difficulty
of getting to certain kinds of speech.
00:51:54.920 --> 00:51:58.000
We can think of strategies
that mix all of them -
00:51:58.120 --> 00:52:01.560
- of which government regulation
remains the least desirable -
00:52:01.680 --> 00:52:04.480
- and also the hardest
to persuade people about...
00:52:04.600 --> 00:52:09.360
We can attack at the point of technology.
The heart of this is the platforms.
00:52:11.760 --> 00:52:16.320
- So we should break them up.
- But is that splitting the Hydra's heads?
00:52:16.440 --> 00:52:20.640
There's different ways to do it.
One simple way would be to say...
00:52:20.760 --> 00:52:23.240
We're really just talking
Facebook and Twitter.
00:52:23.360 --> 00:52:26.840
You say, you cannot be both
a social network and a news feed.
00:52:26.960 --> 00:52:29.920
You have to break those
into two separate, unrelated businesses.
00:52:30.040 --> 00:52:36.040
So the news feed won't pass them any piece
of crap through the social network.
00:52:36.160 --> 00:52:39.440
They're going to have to choose
the things that they look for.
00:52:39.560 --> 00:52:44.880
Second thing that I'd do would be to say,
you just can't use advertising.
00:52:45.000 --> 00:52:47.800
That doesn't violate
current free speech norms -
00:52:47.920 --> 00:52:52.000
- and it would deprive one of the
chief incentives and growth drivers -
00:52:52.120 --> 00:52:55.360
- for the kind of crazy speech
that we get.
00:52:55.480 --> 00:53:01.720
We also have to change the algorithms,
we need to inject serendipity -
00:53:01.840 --> 00:53:06.240
- because how we form ourselves as human
beings and how we learn about the world -
00:53:06.360 --> 00:53:10.840
- is to a great extent by chance
interactions with places and people.
00:53:10.960 --> 00:53:14.880
So we have to put that back
into cyberspace -
00:53:15.000 --> 00:53:18.400
- because the algorithms thrive
on making echo chambers.
00:53:18.520 --> 00:53:24.560
I agree. The way I think about it is
the platforms are already curating -
00:53:24.680 --> 00:53:28.960
- based on their assumptions
about what they think you want -
00:53:29.080 --> 00:53:32.120
- and as a result
are feeding you something.
00:53:32.240 --> 00:53:36.760
My idea is, they don't make those choices
for us. You have to make your own choices.
00:53:36.880 --> 00:53:41.520
I think it's a huge mistake not to impose
the same things on Google.
00:53:41.640 --> 00:53:45.520
If they want to be so big,
they must release their algorithms.
00:53:45.640 --> 00:53:48.840
We have to be able as individuals
to go in and say -
00:53:48.960 --> 00:53:54.320
- do we want to have a neutral view
or a view that is completely cut?
00:53:54.440 --> 00:53:58.600
Agreed. So Twitter, Google and Facebook
are the three big ones.
00:54:07.920 --> 00:54:10.280
Tech companies are waking up -
00:54:10.400 --> 00:54:14.560
- to the potentially harmful consequences
of their products -
00:54:14.680 --> 00:54:18.400
- but they are corporations,
whose primary function is -
00:54:18.520 --> 00:54:21.840
- to make money for their shareholders.
00:54:21.960 --> 00:54:27.080
And that's a structural fact
that is going to make it difficult -
00:54:27.200 --> 00:54:31.280
- if we researchers find out that
the problem is their business model.
00:54:31.400 --> 00:54:37.080
So there may be some
tough impasses ahead.
00:54:55.400 --> 00:55:00.200
It's a fantastic experiment, but we should
realize that it is an experiment.
00:55:00.320 --> 00:55:03.640
We've been experimenting with society.
00:55:03.760 --> 00:55:07.720
I'm grateful to Facebook and Twitter. In
a way, they've done something fantastic.
00:55:10.160 --> 00:55:15.160
But on the other hand,
I think we need to recognize that...
00:55:16.280 --> 00:55:22.120
Well... how should I put this?
I'm trying to choose my words carefully.
00:55:22.240 --> 00:55:24.280
But my impression is -
00:55:24.400 --> 00:55:27.240
- that we've all made a really big mistake.
00:56:39.320 --> 00:56:44.160
What's new is the scale,
the speed of all these problems.
00:56:44.280 --> 00:56:48.040
So, do we have the time needed -
00:56:48.160 --> 00:56:51.280
- to vet all the possible
unintended consequences -
00:56:51.400 --> 00:56:55.560
- or do some of these problems need
to be acted on very quickly?
00:56:55.680 --> 00:56:59.000
I'm thinking of climate change and
the weather patterns we've been seeing -
00:56:59.120 --> 00:57:03.760
- and data that's beginning to suggest
that the Paris Accords were -
00:57:03.880 --> 00:57:08.920
- probably too weak, if anything.
And yet we see very little action.
00:57:13.720 --> 00:57:15.520
This time, if we get it wrong -
00:57:15.640 --> 00:57:19.640
- it won't just be a matter of adjustment
locally, it'll be a global problem.
00:57:19.760 --> 00:57:25.400
So the challenge is to make sure that
the 195 countries in the Paris Accords -
00:57:25.520 --> 00:57:29.920
- actually do something
about carbon emissions fairly quickly -
00:57:30.040 --> 00:57:34.560
- so that the next two or three
generations are not living -
00:57:34.680 --> 00:57:37.160
- in a world that is so hot -
00:57:37.280 --> 00:57:41.760
- that agriculture breaks down, new
diseases appear, cities are flooded etc.
00:57:47.840 --> 00:57:53.000
The first thing CO2 does, among other
things, is trap the heat from the sun -
00:57:53.120 --> 00:57:57.320
- and as a consequence of that,
the planet is getting warmer.
00:57:57.440 --> 00:57:59.480
Historically speaking, it's a good thing.
00:57:59.600 --> 00:58:03.640
Without CO2 in the atmosphere
we'd be living on a snowball earth.
00:58:03.760 --> 00:58:07.640
So we need a certain amount of CO2
in the atmosphere.
00:58:07.760 --> 00:58:11.760
And we have wobbled between
180 ppm during the ice ages -
00:58:11.880 --> 00:58:17.720
- to 280 ppm coming out of the ice ages.
And we have now pushed that up.
00:58:17.840 --> 00:58:21.120
By now we have crossed
the 400-ppm mark -
00:58:21.240 --> 00:58:26.680
- and we're going up about 2.5 ppm a year.
There's no question that we're doing that.
00:58:33.280 --> 00:58:35.840
The pain comes long after it's too late -
00:58:35.960 --> 00:58:39.560
- because there is
a 50-60-year, if not longer, delay -
00:58:39.680 --> 00:58:42.720
- between having put
the greenhouse gases into the atmosphere -
00:58:42.840 --> 00:58:44.920
- and the oceans having warmed up.
00:58:45.040 --> 00:58:48.840
Even though we've reached 400 ppm,
we haven't experienced that world yet.
00:58:57.880 --> 00:59:03.440
Energy is a large player in the system. We
do need a brand-new energy infrastructure.
00:59:03.560 --> 00:59:07.000
I don't think we can just
fix the one we have.
00:59:07.120 --> 00:59:10.840
And it better be affordable,
clean, carbon-neutral -
00:59:10.960 --> 00:59:15.240
- and not introduce its own new set
of environmental issues.
00:59:15.360 --> 00:59:19.800
Solar energy is by far our best hope.
It's getting cheaper and cheaper -
00:59:19.920 --> 00:59:25.080
- but we must overcome the inherent
intermittency issues we have.
00:59:25.880 --> 00:59:28.120
With energy, we need to do three things.
00:59:28.240 --> 00:59:31.720
We need to extract it. That could be
a solar panel, that could be a coal mine.
00:59:31.840 --> 00:59:35.760
We need to convert it into a useful form.
00:59:35.880 --> 00:59:41.040
And we have to clean up after ourselves.
I argue these are roughly equal thirds.
00:59:41.160 --> 00:59:44.960
So far, we've taken the point of view that
we can just dump it in the atmosphere.
00:59:45.080 --> 00:59:47.560
The atmosphere is big and can handle it.
00:59:47.680 --> 00:59:50.680
And then we found out
the atmosphere is not big enough.
00:59:50.800 --> 00:59:55.480
This is the same story
we had with sewage in the 19th century.
00:59:55.600 --> 00:59:59.840
People said, "Well, a little bit of slop
hasn't hurt anybody yet." Well...
01:00:04.880 --> 01:00:11.080
We need to stop at 450 ppm,
which is the goal set by the IPCC.
01:00:11.200 --> 01:00:15.640
You can think of this as a translation
of the 2-degree warming limit.
01:00:21.560 --> 01:00:27.440
At current trends, we are about
17 years away from hitting 450 ppm.
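As a quick arithmetic check of that figure (the ~407-ppm starting level below is back-solved from the speakers' own numbers, not stated in the film):

```python
# Rough check of the "17 years to 450 ppm" claim.
current_ppm = 407.5      # assumed level at the time of filming (back-solved)
target_ppm = 450.0       # the stabilization goal cited above
rise_per_year = 2.5      # the stated current growth rate

years_left = (target_ppm - current_ppm) / rise_per_year
print(f"Years to 450 ppm at current trends: {years_left:.0f}")  # -> 17
```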
01:00:27.560 --> 01:00:33.640
Now, we say it would be nice if we could
reduce and recycle it, but if we fail -
01:00:33.760 --> 01:00:38.480
- dumping it in the atmosphere is
considered acceptable. We have to stop that.
01:00:40.240 --> 01:00:43.960
This is a key point. To stabilize at any
temperature, we have to get to net zero.
01:00:44.080 --> 01:00:46.960
And that's what got me started.
When I started...
01:00:47.080 --> 01:00:51.040
This was in the early '90s and climate
models weren't all that good yet.
01:00:51.160 --> 01:00:57.000
I said, "I don't really care whether
the number is 350, 450, 550 or even 750."
01:00:57.120 --> 01:00:59.560
We need to stop.
01:01:02.280 --> 01:01:06.120
And the challenge right now is that
we're already in an overshoot -
01:01:06.240 --> 01:01:10.520
- and we will have to bend the curve
back down at the end.
01:01:10.640 --> 01:01:13.760
I think that overshoot is about 100 ppm.
01:01:13.880 --> 01:01:19.280
That is more CO2 than was
emitted in the 20th century.
01:01:21.720 --> 01:01:24.080
So even if I could stop overnight -
01:01:24.200 --> 01:01:27.560
- you have a very serious
residual carbon dioxide problem.
01:01:27.680 --> 01:01:31.640
The fact that I cannot stop overnight
will make it even harder.
01:01:31.760 --> 01:01:35.120
We have an enormous
sequestration effort in front of us -
01:01:35.240 --> 01:01:38.840
- and it's not a question of whether
we like carbon sequestration.
01:01:38.960 --> 01:01:42.960
We gave up on that option
in the 1980s and '90s -
01:01:43.080 --> 01:01:46.440
- because we decided
not to do anything.
01:01:48.800 --> 01:01:51.640
We have littered the atmosphere
with so much carbon -
01:01:51.760 --> 01:01:55.240
- that we have to clean up
after what we already did.
01:01:55.360 --> 01:02:00.440
One central technology to make this work,
not the only one, but one -
01:02:00.560 --> 01:02:04.120
- is the ability to pull CO2
back out of the air.
01:02:04.240 --> 01:02:08.320
We are grossly underestimating what
it will take to stabilize these things.
01:02:08.440 --> 01:02:13.440
If people say we need to have 3-4 gigatons
of negative emissions per year -
01:02:13.560 --> 01:02:16.680
- that's wishful thinking.
01:02:16.800 --> 01:02:18.960
Okay.
01:02:27.480 --> 01:02:32.000
You can ask: "How could I get the carbon
dioxide back out of the environment?"
01:02:32.120 --> 01:02:35.840
You have three options.
You either get it from the biomass.
01:02:35.960 --> 01:02:41.520
You're growing things and photosynthesis
picks CO2 out of the air. And it works.
01:02:41.640 --> 01:02:44.480
But you'd have to dedicate more area -
01:02:44.600 --> 01:02:48.920
- than we currently have dedicated
to agriculture. That sets the scale.
01:02:49.040 --> 01:02:54.240
I'm not telling you we shouldn't do that.
In a portfolio, it will play a part.
01:02:54.360 --> 01:02:59.520
Secondly, you could collect CO2 from the
ocean. There are some fundamental issues.
01:02:59.640 --> 01:03:04.840
Each molecule of CO2 is hidden
behind 25,000 molecules of water -
01:03:04.960 --> 01:03:09.240
- whereas in the air,
the 400 ppm translates into 1 in 2,500.
01:03:09.360 --> 01:03:14.440
I think from a technological perspective,
your best bet is the atmosphere.
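The two dilution figures quoted here can be checked directly; this sketch simply restates the speaker's ratios:

```python
# Sanity check of the air-vs-ocean dilution figures quoted above.
air_ppm = 400
air_molecules_per_co2 = 1_000_000 / air_ppm     # -> 2,500
ocean_molecules_per_co2 = 25_000                # stated: 25,000 waters per CO2

print(f"Air:   1 CO2 per {air_molecules_per_co2:,.0f} molecules")
print(f"Ocean: 1 CO2 per {ocean_molecules_per_co2:,} molecules")
ratio = ocean_molecules_per_co2 / air_molecules_per_co2
print(f"Air is ~{ratio:.0f}x more concentrated per molecule encountered")
```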
01:03:17.240 --> 01:03:22.040
Yes, we know we can do that because
in a submarine, people do this routinely.
01:03:22.160 --> 01:03:25.560
In a spacecraft,
people do that routinely.
01:03:25.680 --> 01:03:31.640
The challenge is not can we do it, but can
we do it at a price the world can afford?
01:03:33.200 --> 01:03:37.680
If we keep it at $500 a ton,
that's an economic collapse.
01:03:37.800 --> 01:03:40.720
If we can get it to $50 a ton -
01:03:40.840 --> 01:03:44.080
- the world will hardly notice
that it had to fix it.
01:03:44.200 --> 01:03:47.720
If we can get down to $100 a ton,
we start looking interesting -
01:03:47.840 --> 01:03:51.320
- and I think we are
within striking distance.
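To see why the price per ton matters so much, here is the annual bill each quoted price implies at full scale. The 36-gigaton removal figure comes up later in this conversation; the world-GDP comparison (roughly $80 trillion a year) is an outside ballpark, not a number from the film:

```python
# What each quoted capture price implies at global scale.
TONS_PER_YEAR = 36e9     # global removal scale, stated later in the discussion

for price in (500, 100, 50):             # quoted USD per ton of CO2
    cost = TONS_PER_YEAR * price
    print(f"${price}/ton -> ${cost / 1e12:.1f} trillion per year")
# $500/ton -> $18.0T/yr: "economic collapse" territory (~1/5 of world GDP)
# $100/ton -> $3.6T/yr:  "we start looking interesting"
# $50/ton  -> $1.8T/yr:  "the world will hardly notice"
```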
01:03:51.440 --> 01:03:54.480
Any technology people have
is sorbent-based.
01:03:54.600 --> 01:03:56.800
What's a sorbent?
01:03:56.920 --> 01:04:01.080
You have something which binds the CO2
as it flows over it.
01:04:01.200 --> 01:04:03.640
Our sorbent has
the remarkable feature -
01:04:03.760 --> 01:04:07.280
- that when it's dry, it loves CO2.
When it's wet, it doesn't.
01:04:07.400 --> 01:04:10.240
This thing has air blowing through it.
01:04:10.360 --> 01:04:15.360
With our current design, in
about 20 minutes, it's soaked. It's full.
01:04:15.480 --> 01:04:20.760
It now collapses into this box.
We spray water in.
01:04:20.880 --> 01:04:24.680
Now it's moist and the air inside
will enrich itself with CO2.
01:04:24.800 --> 01:04:27.720
Then I have pumps
and compressors taking it out.
01:04:30.920 --> 01:04:34.840
The figure for the recapturing process
is roughly 1/3 extra.
01:04:34.960 --> 01:04:36.720
- A what?
- 1/3 extra.
01:04:36.840 --> 01:04:39.360
Your energy consumption is 1/3 larger -
01:04:39.480 --> 01:04:42.560
- than if you had ignored the problem
in the first place.
01:04:42.680 --> 01:04:48.680
You're saying that
it'll cost us 33% extra energy -
01:04:48.800 --> 01:04:52.280
- to have a system
where you take care of the carbon?
01:04:52.400 --> 01:04:57.320
Yes, if you want to clean up
after yourself, you will need energy.
01:04:58.760 --> 01:05:03.520
What's it look like when you get
some carbon? Is it a liquid?
01:05:03.640 --> 01:05:08.480
It's CO2 gas. You can compress it
to 100 atmospheres, then it's a liquid.
01:05:08.600 --> 01:05:12.400
So you get CO2 gas,
you compress it to a liquid.
01:05:12.520 --> 01:05:16.960
- Then you stick that liquid somewhere.
- You pipeline that to an oil well.
01:05:17.080 --> 01:05:21.600
- You can't convert it to a solid?
- Yes. That's mineral carbonate.
01:05:21.720 --> 01:05:24.240
We could be in Morocco
where you have basalt -
01:05:24.360 --> 01:05:28.280
- and we could inject the CO2
into the basalt formations -
01:05:28.400 --> 01:05:34.080
- and permanently form solid carbonates
underground, which put the CO2 away.
01:05:36.280 --> 01:05:42.240
You need 60 m2 of frontal area of these.
Think of these things, this wide -
01:05:42.360 --> 01:05:45.440
- this tall, when they are closed,
about 8 m tall.
01:05:45.560 --> 01:05:49.600
So you need about a dozen of them
to fill a shipping container.
01:05:49.720 --> 01:05:52.920
That would give you a ton of CO2 a day.
01:05:54.720 --> 01:05:57.160
And we want 36 gigatons a year.
01:05:57.280 --> 01:06:02.200
Now, you would need 100 million shipping
containers to knock the CO2 back down.
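The container count is straightforward to verify from the one-ton-a-day figure just given:

```python
# Back-of-the-envelope check of the "100 million containers" claim,
# using the one-ton-per-day collector unit described above.
target_tons_per_year = 36e9          # stated global removal target
tons_per_unit_per_day = 1.0          # one container-sized collector bank

units = target_tons_per_year / 365 / tons_per_unit_per_day
print(f"Collector units needed: ~{units / 1e6:.0f} million")  # -> ~99 million
```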
01:06:03.760 --> 01:06:07.160
If you look at a car engine,
it costs about $10 a kilowatt.
01:06:07.280 --> 01:06:10.280
A power plant costs
$1,000 a kilowatt.
01:06:10.400 --> 01:06:14.680
So the mass production of the car engine,
we argue, is what makes it so cheap.
01:06:14.800 --> 01:06:20.000
So I see us not building
very huge collectors, but instead -
01:06:20.120 --> 01:06:23.680
- building collectors
on the ton-a-day scale.
01:06:28.320 --> 01:06:31.000
We build 80 million cars
and trucks a year -
01:06:31.120 --> 01:06:35.680
- and Shanghai Harbour sends out
30 million filled containers a year.
01:06:35.800 --> 01:06:39.800
So clearly, industrial production
can reach these scales -
01:06:39.920 --> 01:06:43.000
- without getting
completely bent out of shape.
01:06:43.120 --> 01:06:47.920
So I'm arguing this is
a very large scale, no doubt about it -
01:06:48.040 --> 01:06:53.440
- but it's also a very large problem, so
you have to operate at a very large scale.
01:06:53.560 --> 01:06:56.600
On the other hand,
the scale is not so prohibitively large -
01:06:56.720 --> 01:07:00.120
- that you would throw up your arms and
say, we've never done anything like that.
01:07:00.240 --> 01:07:03.400
Yes, we are doing things
on this scale routinely.
01:07:03.520 --> 01:07:04.880
One more point.
01:07:05.000 --> 01:07:10.440
Once you have it, it will take
40 years to pull things back down.
01:07:12.680 --> 01:07:15.840
We have to come
to a societal agreement -
01:07:15.960 --> 01:07:20.440
- that if you want to use
a ton of fossil fuel carbon -
01:07:20.560 --> 01:07:24.120
- you'd better put a ton away,
and it is certified that you did that.
01:07:24.240 --> 01:07:27.760
Governments could buy
these certificates of sequestration -
01:07:27.880 --> 01:07:30.040
- off the market and void them.
01:07:30.160 --> 01:07:34.600
The moment they did that,
you have a true negative emission.
01:07:39.400 --> 01:07:44.480
Today, the polluter is the one
who has control over fixing the problem.
01:07:44.600 --> 01:07:49.240
If you have air capture, they lose
that control. You can now say:
01:07:49.360 --> 01:07:53.800
"If you don't want to fix it, we'll fix it
for you and we'll send you the bill."
01:07:53.920 --> 01:07:56.920
You'd be surprised how fast that
power plant operator would figure out -
01:07:57.040 --> 01:08:01.840
- that they themselves can fix
this problem internally much faster.
01:08:01.960 --> 01:08:05.840
So I would argue
we're actually not slowing down action.
01:08:05.960 --> 01:08:11.840
By democratizing the process of doing it,
taking it away from the polluters -
01:08:11.960 --> 01:08:16.760
- we actually will accelerate
the process rather significantly.
01:09:09.880 --> 01:09:13.320
One useful way to deal
with a lot of these things is -
01:09:13.440 --> 01:09:18.080
- pricing externalities back
into the deal. For example -
01:09:18.200 --> 01:09:23.000
- carbon emission rules,
or paying for education through taxes.
01:09:23.120 --> 01:09:26.200
It's subsidizing care work -
01:09:26.320 --> 01:09:31.520
- and it's putting extra taxes on sugar
because of the health burdens it creates.
01:09:41.840 --> 01:09:48.240
I met the manager of Rite Aid,
a nice old gent, a few years ago.
01:09:48.360 --> 01:09:50.160
So I asked this guy:
01:09:50.280 --> 01:09:55.640
"Ever considered not selling cigarettes
since you're a healthcare company?"
01:09:55.760 --> 01:09:59.640
And he said, "Young lady..."
01:09:59.760 --> 01:10:03.560
Yes. Thank you.
That was really nice, wasn't it?
01:10:03.680 --> 01:10:09.200
He said, "You and your friends need to get
the balls to make the government stop us -
01:10:09.320 --> 01:10:12.720
- because we are not going
to do it unilaterally."
01:10:12.840 --> 01:10:17.200
In many cases, corporations
would be happy to be regulated -
01:10:17.320 --> 01:10:20.760
- as long as you regulate the competition.
01:10:22.720 --> 01:10:27.520
I got to hear Paul Polman, CEO of
Unilever, and the first thing he said is:
01:10:27.640 --> 01:10:33.280
"I am deeply committed for the sake
of my grandchildren to sustainability."
01:10:33.400 --> 01:10:38.960
I promise you, there are lots and lots of
CEOs of very, very large corporations -
01:10:39.080 --> 01:10:42.840
- who can't wait to be regulated,
to be forced -
01:10:42.960 --> 01:10:47.560
- by a changing regulatory environment
into more sustainable policies.
01:10:47.680 --> 01:10:51.320
Don't expect them to go public on this -
01:10:51.440 --> 01:10:55.760
- because they have to cope with
their own boards and shareholders as well.
01:10:55.880 --> 01:10:58.440
But they are waiting for that moment.
01:11:00.960 --> 01:11:06.880
Corporations are so multinational, they're
gluing parts of the Earth together.
01:11:07.000 --> 01:11:12.160
The problem is regulating them, because
regulations now are largely a local thing.
01:11:12.280 --> 01:11:15.040
On the other hand, there is
some power of governments to say:
01:11:15.160 --> 01:11:18.840
"If you want to do business in my country,
you've got to obey this standard."
01:11:18.960 --> 01:11:24.160
Actually, it's kind of worse than that,
because the countries usually say:
01:11:24.280 --> 01:11:27.480
"Come to our country and hire some people,
and we'll lower your taxes."
01:11:27.600 --> 01:11:34.800
With the European Commission, we have tried over the
last few years to get tax laws unified -
01:11:34.920 --> 01:11:40.040
- and of course,
Luxembourg and Ireland opposed.
01:11:40.160 --> 01:11:44.080
So only some lame language
could actually be passed.
01:11:44.200 --> 01:11:48.880
I've come to a radical position that
the corporate tax system is unfixable.
01:11:50.160 --> 01:11:55.000
Corporations will always have more
lobbying power than government can resist.
01:11:55.120 --> 01:11:59.160
Instead we should focus on
actually taxing capital quite directly -
01:11:59.280 --> 01:12:05.520
- because all corporations are owned
generally by the 1% or even the 0.1%.
01:12:05.640 --> 01:12:08.480
We know who they are.
We know where they live.
01:12:08.600 --> 01:12:13.240
We've seen governments figuring out
where their assets are, if they want to.
01:12:13.360 --> 01:12:16.240
Almost all of our tax burden is on labor -
01:12:16.360 --> 01:12:19.360
- and in this technological world,
that does not make sense.
01:12:19.480 --> 01:12:23.120
So actually taxing the largely
dead capital of 100,000 people -
01:12:23.240 --> 01:12:27.400
- sitting in the Cayman Islands
or wherever is much simpler -
01:12:27.520 --> 01:12:31.040
- than trying to sort out
the international shell games -
01:12:31.160 --> 01:12:34.200
- that Apple
or all these other companies play.
01:12:34.320 --> 01:12:38.480
It creates an incentive to use the capital.
Otherwise it costs you money to hold it.
01:12:38.600 --> 01:12:41.520
Exactly. And it never touches
the real economy.
01:12:41.640 --> 01:12:45.320
5% of GDP.
01:12:45.440 --> 01:12:50.840
When you map it out, you see
these loops of money sloshing around -
01:12:50.960 --> 01:12:55.040
- that never touch the real economy,
that are enriching people in the top 1%.
01:12:55.160 --> 01:12:58.720
The people who take a little bit off
the top while creating no value.
01:12:58.840 --> 01:13:02.920
Take a little off the top
of a big river, it's a lot.
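A tiny numeric illustration of the holding-cost incentive mentioned a moment ago. All rates here are invented placeholders, not figures from the film:

```python
# Hypothetical illustration: a direct tax on capital makes idle wealth shrink,
# so holding it costs money unless it is put to productive use.
wealth = 10_000_000          # assumed idle capital, USD
capital_tax = 0.02           # assumed 2% annual tax on capital
productive_return = 0.05     # assumed return if invested in the real economy

idle = -capital_tax * wealth
working = (productive_return - capital_tax) * wealth
print(f"Held idle:   {idle:+,.0f} USD/year")     # -200,000
print(f"Put to work: {working:+,.0f} USD/year")  # +300,000
```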
01:13:54.560 --> 01:13:58.400
Inequality has increased
dramatically since the '70s -
01:13:58.520 --> 01:14:03.920
- so the wealth that's been created
has not been shared significantly -
01:14:04.040 --> 01:14:08.560
- because of ideologies like the one
that's currently in power -
01:14:08.680 --> 01:14:12.800
- that if you cut taxes for the rich,
they'll make more stuff -
01:14:12.920 --> 01:14:16.960
- and then it'll all trickle down
to the poor. But that hasn't happened.
01:14:35.160 --> 01:14:41.040
The way in which modern economies
have characterized growth in the West -
01:14:41.160 --> 01:14:46.400
- or globally since the '40s was GDP,
which is a measure of output...
01:14:46.520 --> 01:14:51.840
How many cars we make and all this stuff.
It's not a horrible proxy for growth.
01:14:51.960 --> 01:14:55.160
But as the economy has evolved,
it's gotten worse and worse.
01:14:55.280 --> 01:14:58.400
There are a bunch of
fundamental problems with it.
01:14:58.520 --> 01:15:04.720
One of the fundamental problems of it is
that it does not account for inequality.
01:15:07.320 --> 01:15:09.800
It is possible to have GDP rise -
01:15:09.920 --> 01:15:13.720
- where a few people at the top are
the beneficiaries of all that growth -
01:15:13.840 --> 01:15:15.760
- and everybody else loses.
01:15:15.880 --> 01:15:19.320
So if you look at the GDP statistics,
it looks like things are going great.
01:15:19.440 --> 01:15:24.040
But if you examine the actual experience
of the typical person -
01:15:24.160 --> 01:15:28.560
- things are getting worse,
and this is happening in most of the West.
01:15:28.680 --> 01:15:30.880
The other thing about GDP is -
01:15:31.000 --> 01:15:36.240
- it doesn't capture any externalities
like pollution and stuff like that.
01:15:36.360 --> 01:15:43.000
And it characterizes outputs
but not outcomes in terms of welfare.
01:15:45.960 --> 01:15:49.120
There's been a lot of good research
about what people care about.
01:15:49.240 --> 01:15:54.240
Do they actually care about inequality?
Does it matter to anybody but economists?
01:15:54.360 --> 01:15:59.680
The answer is they do have
a set of values around meritocracy.
01:15:59.800 --> 01:16:03.600
People get what they deserve,
a fairness of process -
01:16:03.720 --> 01:16:08.720
- that the process itself leads
to fair outcomes.
01:16:08.840 --> 01:16:14.440
Inequality can be a signal of an unfair
process and also of a fair process.
01:16:14.560 --> 01:16:17.280
Bill Gates is really rich.
Are you disturbed by that?
01:16:17.400 --> 01:16:21.880
Most people would say he did something
useful in a meritocratic process.
01:16:22.000 --> 01:16:25.840
What about some shady hedge fund manager
who made a lot of money during the crisis?
01:16:25.960 --> 01:16:28.040
Well, probably not.
01:16:28.160 --> 01:16:33.360
It's important to ground these discussions
in stuff and values people care about.
01:16:33.480 --> 01:16:38.160
It then bridges into: How do you
construct a new social contract -
01:16:38.280 --> 01:16:41.880
- that people would broadly view
as fair and ethical?
01:16:43.520 --> 01:16:48.600
For the Americans in the room
it's worth just reflecting -
01:16:48.720 --> 01:16:52.160
- that if you care about reproductive
rights, there's a brand for you.
01:16:52.280 --> 01:16:55.320
If you care about animal rights,
there's a brand for you.
01:16:55.440 --> 01:16:58.040
If you care about civil rights,
there's a brand for you.
01:16:58.160 --> 01:17:01.560
If you care about the rights
of old people, there's a brand for you.
01:17:01.680 --> 01:17:06.120
If you care about the baby seals
or the whales or, you know, you pick it -
01:17:06.240 --> 01:17:09.520
- there's a go-to organization
that represents your interests.
01:17:09.640 --> 01:17:11.680
Gun rights, everything.
01:17:11.800 --> 01:17:16.880
When Trump's administration pushed through
a $1.5-trillion tax cut for the rich -
01:17:17.000 --> 01:17:20.200
- who did Americans turn to? No one.
01:17:20.920 --> 01:17:24.000
No one.
There was no organizing to counter that.
01:17:24.120 --> 01:17:26.280
And that's why it got pushed through.
01:17:27.640 --> 01:17:30.960
The median family in America earns
$59,000 a year.
01:17:31.080 --> 01:17:37.480
If they'd been held harmless by inequality
since 1980, they'd earn $86,000.
01:17:37.600 --> 01:17:40.640
If they'd fully participated
in productivity growth -
01:17:40.760 --> 01:17:43.520
- they would earn $101,000.
01:17:43.640 --> 01:17:47.920
Let me underscore this point. The nation
owes the median family a raise -
01:17:48.040 --> 01:17:52.280
- of $25,000-40,000 a year.
It's a lot of money.
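The arithmetic behind those figures, for the record (the implied number of families is an inference from the quoted numbers, not something stated in the film):

```python
# Checking the median-family income figures quoted above (USD/year).
actual = 59_000              # median family income today
held_harmless = 86_000       # if held harmless by inequality since 1980
full_productivity = 101_000  # if fully sharing in productivity growth

print(f"Raise owed: ${held_harmless - actual:,} "
      f"to ${full_productivity - actual:,}")     # $27,000 to $42,000

# The $2 trillion/year aggregate implies very roughly this many families:
families = 2e12 / (held_harmless - actual)
print(f"~{families / 1e6:.0f} million families at the low-end raise")  # ~74
```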
01:17:53.920 --> 01:17:56.360
So in very conservative terms -
01:17:56.480 --> 01:18:02.240
- $2 trillion per year that used to flow
to wages for ordinary Americans -
01:18:02.360 --> 01:18:07.840
- now basically has been
captured by economic elites.
01:18:09.160 --> 01:18:12.520
That gap is why people are so pissed off.
01:18:20.960 --> 01:18:25.600
In the '70s, 65% of American workers
were entitled to overtime.
01:18:25.720 --> 01:18:29.000
Today, it's 7% -
01:18:29.120 --> 01:18:32.520
- because if you're paid more
than $23,600 a year -
01:18:32.640 --> 01:18:36.680
- they can throw you
a fake title like assistant manager -
01:18:36.800 --> 01:18:40.840
- and make you work 78 hours a week
and not pay you overtime.
01:18:40.960 --> 01:18:45.160
Forget your family life. Forget
raising your kids. Forget all that.
01:18:45.280 --> 01:18:49.880
That part of a dignified life goes down
the tubes. But the most insidious thing -
01:18:50.000 --> 01:18:55.640
- and the thing that some of us
in this room built a career on -
01:18:55.760 --> 01:19:01.560
- is getting two people to do the job of
three by making them work 60 hours a week.
01:19:01.680 --> 01:19:03.960
Something I did my whole career.
01:19:04.080 --> 01:19:07.720
If you do that 60 million times
across an economy -
01:19:07.840 --> 01:19:10.840
- you pull 20 million jobs
out of the economy.
01:19:13.480 --> 01:19:18.800
I spent 20 years at McKinsey
doing that stuff, and it really worked.
01:19:18.920 --> 01:19:22.000
We really changed the way companies ran.
01:19:22.120 --> 01:19:25.960
A disturbing aspect of what's
happened over the last four decades -
01:19:26.080 --> 01:19:30.400
- has been the gutting of the middle-class,
again, particularly in the US -
01:19:30.520 --> 01:19:35.520
- from an economic standpoint, but also
from a political power standpoint.
01:19:35.640 --> 01:19:39.880
We've seen growing dysfunction in
our democracy and in its representativeness.
01:19:40.000 --> 01:19:45.440
And so we've been excluding people
both economically and politically.
01:19:45.560 --> 01:19:48.560
That's a poisonous combination.
01:20:14.760 --> 01:20:19.640
So far, the way we learn
about economic organization -
01:20:19.760 --> 01:20:23.360
- is we do an experiment,
you know, we try out Marxism.
01:20:23.480 --> 01:20:26.360
That was a long, painful experiment.
01:20:34.280 --> 01:20:40.000
Here, we tried Milton Friedman-style
free-market capitalism and deregulation -
01:20:40.120 --> 01:20:45.120
- and the 2008 crisis is a clear lesson
that that doesn't work.
01:20:45.240 --> 01:20:49.320
Unfortunately,
that lesson hasn't been absorbed.
01:20:50.600 --> 01:20:54.560
These trends are true in every developed
country, but to differing degrees.
01:20:54.680 --> 01:20:59.320
- Inequality is rising in every country.
- There's a bit of nuance to this.
01:20:59.440 --> 01:21:03.920
The US is the psycho case
and Britain is a close second.
01:21:04.040 --> 01:21:07.720
There's a spectrum where some countries
have experienced rising inequality -
01:21:07.840 --> 01:21:11.200
- others stagnant wages,
others declining social mobility.
01:21:11.320 --> 01:21:15.440
But only the US has experienced all three.
There's a strong correlation between -
01:21:15.560 --> 01:21:20.760
- the strength of these effects and how
much they drank the neoliberal Kool-Aid.
01:21:20.880 --> 01:21:25.840
There's no question that the embrace of
Milton, what we now call neoliberalism...
01:21:25.960 --> 01:21:31.120
- Although they hate that phrase.
- What do they call it? Just free markets?
01:21:31.240 --> 01:21:37.360
They just call it economics. The law
and economics movement in law schools.
01:21:50.680 --> 01:21:53.560
Neoliberalism is the story
we've told ourselves -
01:21:53.680 --> 01:21:58.320
- which has enabled a small group
of people to capture all the progress.
01:21:58.440 --> 01:22:01.880
And the story is relatively new -
01:22:02.000 --> 01:22:05.480
- 30 or 40 years old,
and the story contains bits -
01:22:05.600 --> 01:22:09.000
- like the only purpose of the corporation
is to enrich shareholders -
01:22:09.120 --> 01:22:15.760
- which today is accepted
by really smart people that I know.
01:22:15.880 --> 01:22:20.000
As... you know,
as true in the same way that -
01:22:20.120 --> 01:22:25.280
- force equals mass times acceleration is
considered true. And it was just made up.
01:22:25.400 --> 01:22:29.920
It's a moral law about how we arrange
the status and privileges in a society.
01:22:30.040 --> 01:22:35.000
So killing that idea and replacing it
with something better is essential -
01:22:35.120 --> 01:22:39.440
- if you're going to have
the kind of society you want to have.
01:22:46.400 --> 01:22:51.200
Capitalism enables people
to have private ownership of things.
01:22:51.320 --> 01:22:55.560
But it doesn't imply that there should be
no rules governing those arrangements -
01:22:55.680 --> 01:22:59.600
- and that the rich should
be able to exploit the poor.
01:22:59.720 --> 01:23:05.360
And that arrangement, which we characterize
as neoliberalism, is essentially a way -
01:23:05.480 --> 01:23:12.120
- that consolidates power and wealth
at the top while excluding other people.
01:23:12.240 --> 01:23:18.880
And that's a political arrangement. That's
not a necessity of the economic system.
01:23:19.000 --> 01:23:22.000
Neoliberalism is a movement
about ideas -
01:23:22.120 --> 01:23:26.000
- and it's not directly focused
on immediate social action.
01:23:26.120 --> 01:23:31.800
It's a belief that if we can change ideas,
we can change the world.
01:23:33.760 --> 01:23:38.520
Ideas are kind of like the operating
system of our politics and economy.
01:23:38.640 --> 01:23:42.680
When you use your computer,
you're not aware of the operating system -
01:23:42.800 --> 01:23:46.320
- but it's back there shaping
and controlling everything.
01:23:46.440 --> 01:23:49.640
Our intellectual frameworks
play a similar role.
01:23:49.760 --> 01:23:53.040
The ideas that we had
based off of neoclassical economics -
01:23:53.160 --> 01:23:58.240
- and neoliberal political ideology
really took off in the '70s and '80s.
01:23:58.360 --> 01:24:04.520
They caused politicians and
business leaders to think differently -
01:24:04.640 --> 01:24:09.560
- about how the economy worked and that
caused them to take different actions.
01:24:09.680 --> 01:24:11.720
The neoliberal paradigm is sophisticated -
01:24:11.840 --> 01:24:15.120
- but you can boil it down
to three core propositions.
01:24:15.240 --> 01:24:19.840
One, that the best way to conceptualize
society is as atomized individuals -
01:24:19.960 --> 01:24:22.920
- competing rationally
to maximize their own utility.
01:24:23.040 --> 01:24:25.320
At the other end of the spectrum,
you believe that -
01:24:25.440 --> 01:24:29.080
- the best measure of success for
your society is wealth measured by GDP.
01:24:29.200 --> 01:24:32.920
Believing those two things,
you have, third, a program -
01:24:33.040 --> 01:24:37.200
- which is the proper role for government,
essentially to encase markets.
01:24:37.320 --> 01:24:40.160
Because free markets are the best way
for atomized individuals -
01:24:40.280 --> 01:24:44.160
- competing to maximize
their self-interest to generate wealth.
01:24:44.280 --> 01:24:47.360
The neoliberal paradigm had
a profound effect in shaping -
01:24:47.480 --> 01:24:52.000
- what we did all across the board,
whether it's trade policy -
01:24:52.120 --> 01:24:57.080
- industrial policy, labor policy, welfare
policy. It colonized field after field.
01:24:57.200 --> 01:25:00.040
The law schools are completely captured
by law and economics.
01:25:00.160 --> 01:25:04.960
So, the neoliberals reshaped
the entire globe.
01:25:09.240 --> 01:25:14.000
We're at a moment today much like the
post-Great Depression and the early 1970s -
01:25:14.120 --> 01:25:19.520
- in the sense that we're dealing with a
world which feels like it's coming apart -
01:25:19.640 --> 01:25:24.160
- for reasons that are related to,
though not all exclusively attributable -
01:25:24.280 --> 01:25:28.000
- to the product of this set of ideas.
01:25:28.120 --> 01:25:33.120
Is this because the paradigm is going out
of date or because the world is changing?
01:25:33.240 --> 01:25:34.560
I think it's both.
01:25:34.680 --> 01:25:39.280
So one of the tendencies of a political
economy that's focused around markets -
01:25:39.400 --> 01:25:42.480
- is to generate wealth inequality.
01:25:42.600 --> 01:25:45.680
In the '70s and '80s,
that was not that big a deal -
01:25:45.800 --> 01:25:48.000
- because we were at a historic low
for wealth inequality.
01:25:48.120 --> 01:25:52.720
It didn't happen precipitously fast and it
seemed to be solving problems for people.
01:25:52.840 --> 01:25:55.160
But by today, it's a big problem.
01:25:55.280 --> 01:25:59.200
People are really aware of it. It's a
generator of enormous social tensions -
01:25:59.320 --> 01:26:02.080
- all across the way
and through different lenses.
01:26:02.200 --> 01:26:05.440
The neoliberal paradigm doesn't have great
answers to it. Even the progressive side.
01:26:05.560 --> 01:26:12.600
The global free trade regime that was
accelerated by it has produced benefits -
01:26:12.720 --> 01:26:15.840
- but also a set of tensions they don't
have answers to within that paradigm.
01:26:15.960 --> 01:26:20.400
And then the biggest one, which is the
coming wave of AI automation technology -
01:26:20.520 --> 01:26:24.760
- is just upending the workplace, and
the neoliberal paradigm is essentially:
01:26:24.880 --> 01:26:29.560
"We're just going to allow this massive
further shift from labor to capital."
01:26:29.680 --> 01:26:34.680
There's no good answers
within this paradigm.
01:26:38.480 --> 01:26:41.520
A key part of this is changing education.
01:26:41.640 --> 01:26:44.920
The next generation of leaders needs to be
better taught than those we have now.
01:26:45.040 --> 01:26:48.080
They need to not be taught
this neoliberal paradigm.
01:26:48.200 --> 01:26:51.600
The neoliberals did recognize this
because you're training the people -
01:26:51.720 --> 01:26:55.520
- that are going to teach others. You
didn't need to capture every university.
01:26:55.640 --> 01:27:00.200
They captured a few that trained graduate
students who then taught everywhere else.
01:27:00.320 --> 01:27:04.120
The students were the people who then
went out and became the policymakers -
01:27:04.240 --> 01:27:06.960
- and did all of the work.
01:27:08.480 --> 01:27:13.760
A set of ideas came in and changed
the way the world worked.
01:27:13.880 --> 01:27:16.000
We would argue not for the better.
01:27:16.120 --> 01:27:21.120
Maybe a new set of ideas can come in and
change the world in a better direction.
01:28:30.880 --> 01:28:37.280
The trick is to find the happy medium.
A place where... a typical person -
01:28:37.400 --> 01:28:41.640
- working reasonably hard can lead
a dignified life and feel fairly treated.
01:28:41.760 --> 01:28:44.760
And not to make everybody rich
or everybody the same -
01:28:44.880 --> 01:28:48.400
- or to make getting rich illegal
or anything like that.
01:28:48.520 --> 01:28:54.120
You just want to create a society where
everyone... you'll never get to everyone.
01:28:54.240 --> 01:29:00.640
Where the majority feel like it's fair
and reasonable and high-functioning.
01:29:03.200 --> 01:29:08.640
A key part is constructing
a new intellectual paradigm -
01:29:08.760 --> 01:29:13.440
- and our hypothesis is that we actually
have many of the puzzle pieces for that -
01:29:13.560 --> 01:29:17.640
- lying around in the community
that we're all a part of.
01:29:17.760 --> 01:29:23.040
We have the potential to begin assembling
them into both a positive theory -
01:29:23.160 --> 01:29:27.000
- that better describes social systems
in a more empirically validated way -
01:29:27.120 --> 01:29:32.000
- but also leads to normative theories
about how we can make the systems better -
01:29:32.120 --> 01:29:36.320
- and serve the interests of people
more broadly, and at a conceptual level -
01:29:36.440 --> 01:29:41.840
- the environment needs to be the
wrapper that all this is happening in.
01:29:41.960 --> 01:29:48.760
So, do people see those puzzle pieces
out there? Are we missing some?
01:29:48.880 --> 01:29:53.080
Does that kind of framing help,
or do you think of it differently?
01:29:57.680 --> 01:30:01.040
The big vacuum I see is
that one-paragraph description.
01:30:01.160 --> 01:30:03.560
We can write down a clear
one-paragraph description -
01:30:03.680 --> 01:30:08.800
- of what Milton Friedman-style
neoliberalism is. What is the alternative?
01:30:08.920 --> 01:30:15.440
At the core of the neoliberal movement was
a set of foundational intellectual ideas -
01:30:15.560 --> 01:30:18.880
- but a key jump the neoliberals made
was not just -
01:30:19.000 --> 01:30:22.920
- translation into understandable language
but specifically to moral language.
01:30:23.040 --> 01:30:26.720
If you read the Mont Pelerin Statement,
it is a call to arms.
01:30:26.840 --> 01:30:31.960
It is a set of moral statements
about freedom and opportunity and so on.
01:30:32.080 --> 01:30:36.400
We need to do a bit of shifting of...
We have a core of intellectual ideas -
01:30:36.520 --> 01:30:41.000
- but how do we connect it
into that larger moral narrative?
01:30:41.120 --> 01:30:43.800
It's important to identify the core idea.
01:30:43.920 --> 01:30:47.720
I think the core idea is
adaptive complex systems.
01:30:47.840 --> 01:30:53.320
I see that as the key mechanism that might
both have some grounding in reality -
01:30:53.440 --> 01:30:55.680
- and might appeal to people
and resonate.
01:30:58.880 --> 01:31:03.040
Mainstream economics hasn't bought into
the toolkit they would have to master -
01:31:03.160 --> 01:31:07.000
- in order to be able
to model what's going on.
01:31:07.120 --> 01:31:09.440
Instead of modeling the world -
01:31:09.560 --> 01:31:13.480
- with these very stylized dynamic
stochastic general equilibrium models...
01:31:13.600 --> 01:31:15.880
That's what the models
they use now are called.
01:31:16.000 --> 01:31:21.080
... let's simulate an artificial economy
and make it look like the real economy.
01:31:21.200 --> 01:31:25.400
It's a complete paradigm shift
in the way you do the science.
01:31:34.640 --> 01:31:40.240
The approach is to create artificial
economies that mimic the real economy -
01:31:40.360 --> 01:31:43.600
- closely enough that we can do
experiments in the artificial economy -
01:31:43.720 --> 01:31:48.080
- to see what would happen
if we tried out new ideas.
01:31:49.120 --> 01:31:54.480
I've looked at alternative perspectives
from a complex systems perspective -
01:31:54.600 --> 01:31:59.880
- that the economy's a dynamic system,
constantly evolving and innovating -
01:32:00.000 --> 01:32:04.360
- and that people are not rational
Spock-like robots but actually people.
01:32:11.240 --> 01:32:14.960
Inequality is where the shit hit the fan,
so we should build on that.
01:32:15.080 --> 01:32:18.960
One thing that I think is an essential
part of the complex systems narrative is -
01:32:19.080 --> 01:32:23.760
- that you build models where you actually
respect individuals in the model.
01:32:23.880 --> 01:32:27.760
So you build in the ability to think
about problems like inequality -
01:32:27.880 --> 01:32:33.080
- and those low-level interactions
and the way in which people are linked -
01:32:33.200 --> 01:32:36.360
- to the emergent phenomenon
of the macroeconomy -
01:32:36.480 --> 01:32:39.920
- and the political system
and democracy and everything else.
01:32:40.040 --> 01:32:45.360
You build that into the whole conceptual
framework from the get-go.
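The modeling paradigm being described can be sketched in a few lines. What follows is a deliberately toy model, not the researchers' actual work: equal-wealth agents meet at random and make small zero-sum exchanges (the trading rule and all parameters are invented for illustration), and macro-level inequality emerges from those low-level interactions instead of being assumed:

```python
# Toy agent-based "artificial economy": inequality as an emergent property.
import random

def gini(wealth):
    """Gini coefficient: 0 = perfect equality, 1 = maximal inequality."""
    n, mean = len(wealth), sum(wealth) / len(wealth)
    total_diff = sum(abs(a - b) for a in wealth for b in wealth)
    return total_diff / (2 * n * n * mean)

random.seed(0)
agents = [100.0] * 1000                       # everyone starts equal
for _ in range(200_000):                      # random pairwise exchanges
    i, j = random.sample(range(len(agents)), 2)
    stake = 0.1 * min(agents[i], agents[j])   # invented trading rule
    if random.random() < 0.5:
        agents[i] += stake; agents[j] -= stake
    else:
        agents[i] -= stake; agents[j] += stake

print(f"Gini after exchanges: {gini(agents):.2f}")  # well above 0: emergent
```

Even with a fair coin and identical starting conditions, wealth concentrates over time in exchange models like this one, which is exactly the kind of macro phenomenon the speakers want the framework to capture from the ground up.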
01:32:49.760 --> 01:32:55.160
We can create the new economy without
doing every experiment the hard way.
01:32:55.280 --> 01:32:58.200
But if we want a good simulation
of the economy, we need -
01:32:58.320 --> 01:33:03.320
- large-scale, big science teamwork,
and we could do it.
01:33:03.440 --> 01:33:07.240
We have the skills and the knowledge,
but we're not doing it.
01:33:32.480 --> 01:33:35.280
Earth is the center of the solar system.
01:33:35.400 --> 01:33:38.400
Galileo didn't invent the telescope,
he popularized it. He's like:
01:33:38.520 --> 01:33:41.560
"You can see it with your own eyes."
01:33:41.680 --> 01:33:46.440
The Catholic church said: "Stick that
telescope where the sun doesn't shine."
01:33:46.560 --> 01:33:50.680
And they stuck him in jail
for the rest of his life. Why?
01:33:50.800 --> 01:33:54.960
Because if Earth was diminished,
so were they.
01:33:55.080 --> 01:34:01.840
And the only fact they cared about was
their status, their privileges and power.
01:34:02.760 --> 01:34:06.040
Neoliberalism is exactly the same thing.
01:34:06.160 --> 01:34:09.960
It is an embedded ideology that
benefits a small group of people -
01:34:10.080 --> 01:34:13.720
- and they're going to hang on to it
with their fingernails.
01:34:19.440 --> 01:34:24.120
Part of fixing these problems is a new
set of ideas, a new thought system -
01:34:24.240 --> 01:34:28.440
- a new set of memes
that are more grounded in reality.
01:34:28.560 --> 01:34:33.240
The behavioral model of neoclassical
economics and the neoliberals was made up.
01:34:33.360 --> 01:34:36.760
Nobody studied human beings.
These were philosophers -
01:34:36.880 --> 01:34:40.600
- who made up ideas of how humans behave
that turned out to not be true.
01:34:40.720 --> 01:34:43.800
We'd rather have a system of thought
based on the kind of work -
01:34:43.920 --> 01:34:47.680
- that someone like Molly is doing;
how humans really behave.
01:34:51.280 --> 01:34:54.560
One big revolution in
understanding human behavior -
01:34:54.680 --> 01:34:58.200
- has been understanding humans
as a social species.
01:34:58.320 --> 01:35:03.800
We come equipped with a set of instincts
and behaviors to facilitate cooperation.
01:35:03.920 --> 01:35:07.400
This paints a vastly different picture
from the neoliberal model -
01:35:07.520 --> 01:35:11.560
- which assumes we're all transactional,
self-interested individuals.
01:35:16.440 --> 01:35:18.240
We're incredibly social creatures -
01:35:18.360 --> 01:35:22.480
- and one of our primary points
of distinction is how much we cooperate.
01:35:22.600 --> 01:35:25.640
You can only solve trivial problems
by yourself.
01:35:25.760 --> 01:35:31.080
Any complex problem requires us
to team up and cooperate.
01:35:34.720 --> 01:35:37.720
We would like a new way
of thinking about prosperity -
01:35:37.840 --> 01:35:40.840
- that is linked to
our material standard of living -
01:35:40.960 --> 01:35:44.520
- but is not tied to notions
of market transactions -
01:35:44.640 --> 01:35:47.240
- and instead gets at something fundamental.
01:35:47.360 --> 01:35:50.040
Our goal is to rip economics
down to the studs -
01:35:50.160 --> 01:35:55.120
- and rebuild it around a more accurate
characterization of what people are like -
01:35:55.240 --> 01:36:00.760
- how social systems behave, and a
new and better definition of prosperity -
01:36:00.880 --> 01:36:05.600
- that can align economic progress
and moral progress.
01:36:05.720 --> 01:36:11.160
The concept is that value comes from
what we call solutions to human problems.
01:36:14.040 --> 01:36:17.040
Our stock of solutions to human problems -
01:36:17.160 --> 01:36:21.040
- is a way of thinking
about the wealth of society.
01:36:24.320 --> 01:36:28.320
And growth is the rate at which
we're creating and making available -
01:36:28.440 --> 01:36:31.240
- new solutions to human problems.
01:36:42.600 --> 01:36:47.440
So if we came up with a new solution,
anti-gravity boots to help us get around -
01:36:47.560 --> 01:36:49.880
- but only Bill Gates could afford them -
01:36:50.000 --> 01:36:52.280
- we wouldn't say
society's prosperity has been increased.
01:36:52.400 --> 01:36:56.880
So there's a natural notion that things
have to become widely available to people.
01:36:57.000 --> 01:37:00.480
When antibiotics became very cheap
and everybody could get them -
01:37:00.600 --> 01:37:04.920
- that was a huge increase
in human material welfare.
01:37:05.880 --> 01:37:11.480
If prosperity in societies is accumulation
of solutions to human problems -
01:37:11.600 --> 01:37:16.880
- and not GDP or essentially how
much stuff we burn or move around -
01:37:17.000 --> 01:37:22.680
- and growth is the rate at which we
distribute solutions to human problems -
01:37:22.800 --> 01:37:26.760
- then the way in which we've understood
what capitalism is and how it works -
01:37:26.880 --> 01:37:29.720
- is probably deeply flawed.
01:37:36.800 --> 01:37:41.200
We think capitalism really
needs to be redefined -
01:37:41.320 --> 01:37:45.760
- as a system that is designed and
optimized to solve human problems -
01:37:45.880 --> 01:37:49.120
- and to distribute the solutions
as fast as possible.
01:37:49.240 --> 01:37:51.880
And we call that Market Humanism
to signify -
01:37:52.000 --> 01:37:57.360
- markets in the service of humanity.
Not the other way around.
01:37:58.120 --> 01:38:01.120
Eusocial animals are things
like bees and ants.
01:38:01.240 --> 01:38:04.360
Creatures where their very survival
depends on their cooperation.
01:38:04.480 --> 01:38:06.920
Humans are eusocial creatures.
01:38:07.040 --> 01:38:10.160
Most of us would not survive
without other human beings.
01:38:10.280 --> 01:38:12.400
We all are able to deeply specialize -
01:38:12.520 --> 01:38:16.360
- and then cooperate in networks
to exchange our specialisms -
01:38:16.480 --> 01:38:21.280
- to then tap into the larger societal
knowledge that we have to solve problems.
01:38:21.400 --> 01:38:24.600
Versus the neoliberal narrative,
which is all about atomistic competition.
01:38:24.720 --> 01:38:28.000
When you understand humans
as eusocial creatures -
01:38:28.120 --> 01:38:33.880
- we believe that you end up with what
we call the golden rule of economics -
01:38:34.000 --> 01:38:36.000
- which is the principle of inclusion.
01:38:39.000 --> 01:38:44.200
People have this sense that if
and when we have economic growth -
01:38:44.320 --> 01:38:48.840
- we should include other people
in that growth for moral reasons.
01:38:52.120 --> 01:38:55.440
What our research shows
quite conclusively is -
01:38:55.560 --> 01:39:01.520
- if you understand how economic growth
is really created in human societies -
01:39:01.640 --> 01:39:05.760
- you can see that that framework is
both wrong and backwards -
01:39:05.880 --> 01:39:10.400
- that economic inclusion
turns out to be the thing -
01:39:10.520 --> 01:39:16.240
- the technical mechanism that creates
economic growth in human societies.
01:39:17.440 --> 01:39:20.520
Inclusion creates growth.
01:39:22.720 --> 01:39:25.280
We need to help people
contribute and participate -
01:39:25.400 --> 01:39:28.720
- give them the skills, education,
infrastructure and capabilities -
01:39:28.840 --> 01:39:31.880
- and when they contribute,
they need a fair deal back.
01:39:32.000 --> 01:39:36.200
When you do that, the circle of trust
expands, participation expands -
01:39:36.320 --> 01:39:40.040
- you get more innovation,
more growth and more fairness.
01:39:42.480 --> 01:39:48.200
The economy is people, not money, and the
more people we include in that economy -
01:39:48.320 --> 01:39:53.160
- as workers, entrepreneurs,
inventors, consumers -
01:39:53.280 --> 01:39:57.400
- the bigger the economy grows
and the better it gets for everybody.
01:39:58.520 --> 01:39:59.840
What's the one meme -
01:39:59.960 --> 01:40:03.960
- if you could drive it into the system,
up to the top political and business levels -
01:40:04.080 --> 01:40:08.320
- it would be that inclusion
and cooperation create prosperity.
01:40:08.440 --> 01:40:11.920
If we can get the thinking
to flip to that -
01:40:12.040 --> 01:40:16.320
- that starts opening up a different
conversation and culture.
01:40:16.440 --> 01:40:19.920
Just to underscore how different
this is from a neoliberal framework -
01:40:20.040 --> 01:40:26.240
- in '75, Arthur Okun, a famous economist, wrote
Equality and Efficiency: The Big Tradeoff.
01:40:26.360 --> 01:40:30.440
They put these two things
in opposition to each other.
01:40:30.560 --> 01:40:33.320
If you provide healthcare,
that might be justified -
01:40:33.440 --> 01:40:37.000
- for social justice or fairness reasons,
but there's no free lunch.
01:40:37.120 --> 01:40:40.280
"That's going to hurt
the economy and hurt growth."
01:40:40.400 --> 01:40:44.520
This big tradeoff set up our politics
for the decades after.
01:40:44.640 --> 01:40:50.080
The Democrats became the party of fairness
and the Republicans the party of growth.
01:40:50.200 --> 01:40:55.040
They fought around that big tradeoff, but
they both accepted that larger framework.
01:40:55.160 --> 01:40:59.760
And that contributed strongly
to the breakdown of the social contract -
01:40:59.880 --> 01:41:05.400
- and the dysfunction that we see. We see
that big tradeoff as just nonsense.
01:41:06.400 --> 01:41:08.880
It's hard to underscore enough -
01:41:09.000 --> 01:41:13.040
- how deeply that old idea about
the tradeoff between fairness and growth -
01:41:13.160 --> 01:41:17.000
- is embedded in our political
and economic culture.
01:41:17.120 --> 01:41:21.360
How deeply people who make laws
believe that.
01:41:21.480 --> 01:41:27.720
To break that idea and to replace it with
this other idea can be transformational.
01:41:39.480 --> 01:41:45.240
There's a ton of really powerful
and persuasive abstract theory -
01:41:45.360 --> 01:41:50.920
- that leads you to this conclusion,
and a shit ton of empirical evidence too -
01:41:51.040 --> 01:41:54.200
- that there's basically
no contradictory set of empirical data.
01:41:58.240 --> 01:42:03.240
With respect to policy, we sought
to have a list of fundamental things -
01:42:03.360 --> 01:42:07.440
- that you need to have
to drive these feedback loops.
01:42:07.560 --> 01:42:11.560
They start with things
like fairness and justice -
01:42:11.680 --> 01:42:14.800
- because you can't have
cooperation without them.
01:42:14.920 --> 01:42:18.480
The product of fairness
and justice is trust -
01:42:18.600 --> 01:42:22.600
- which is the glue that makes
cooperative networks go.
01:42:22.720 --> 01:42:26.640
Competition isn't the goal,
which is the conventional view.
01:42:26.760 --> 01:42:30.360
Once you see evolution is the point,
you want as many competitors as possible -
01:42:30.480 --> 01:42:32.880
- and then monopolies aren't right.
01:42:33.000 --> 01:42:37.560
And from that you end up
with a new narrative about inclusion -
01:42:37.680 --> 01:42:40.080
- and the central role
of the middle-class...
01:42:40.200 --> 01:42:44.880
That economies don't grow by trickling down;
they actually grow from the middle out.
01:42:45.000 --> 01:42:48.560
If you look at the history of successful,
prosperous economies -
01:42:48.680 --> 01:42:51.080
- they all have large, vibrant,
diverse middle-classes -
01:42:51.200 --> 01:42:54.160
- because that's where the workers,
the entrepreneurs come from.
01:42:54.280 --> 01:42:57.600
We don't get more innovation
by giving Bill Gates a tax cut.
01:42:57.720 --> 01:43:03.800
If a bunch of kids grow up to be engineers
or entrepreneurs, then we will get some.
01:43:03.920 --> 01:43:08.040
- It's also where demand comes from.
- We've hollowed out the middle-class.
01:43:08.160 --> 01:43:12.320
It's sapped the energy out of demand,
slowed investment, hurt entrepreneurship.
01:43:12.440 --> 01:43:14.600
There's good things going on -
01:43:14.720 --> 01:43:18.160
- but we will be eating our seed corn
if we keep going this way.
01:43:19.840 --> 01:43:22.240
We see a positive role for business
in this framework -
01:43:22.360 --> 01:43:24.920
- because businesses properly run
are not in the business -
01:43:25.040 --> 01:43:28.480
- of making money for shareholders but
in the business of solving human problems.
01:43:28.600 --> 01:43:33.120
Some of the most long-lasting businesses
have thought of themselves in that way.
01:43:33.240 --> 01:43:37.920
The shareholder capitalism revolution,
which only happened starting in the '70s -
01:43:38.040 --> 01:43:43.440
- in some ways took the human purpose
and the soul out of our businesses.
01:43:48.080 --> 01:43:53.280
Defining prosperity and growth
in terms of improving welfare -
01:43:53.400 --> 01:43:56.720
- makes one thing super clear -
01:43:56.840 --> 01:44:04.200
- which is that every economic act
is an explicitly moral choice -
01:44:04.320 --> 01:44:06.600
- because you're either solving
people's problems -
01:44:06.720 --> 01:44:10.880
- or creating more problems
than you solve, and what that does is -
01:44:11.000 --> 01:44:14.000
- it merges the moral and economic world.
01:44:22.960 --> 01:44:27.320
Doing damage to the climate,
which can cause GDP to go up -
01:44:27.440 --> 01:44:32.000
- isn't seen as a positive but as creating
a problem rather than solving a problem.
01:44:32.120 --> 01:44:36.960
Some activities are good, some bad. This
framework helps distinguish between them.
01:44:37.080 --> 01:44:41.320
Solving the climate problem in this
framework is economically positive -
01:44:41.440 --> 01:44:45.120
- because it's increasing
real human welfare.
01:44:47.280 --> 01:44:52.400
One of the things that we like best
about this common sense language -
01:44:52.520 --> 01:44:56.120
- about what prosperity is,
is it democratizes the way -
01:44:56.240 --> 01:45:00.680
- in which humans can talk
about prosperity and purpose.
01:45:02.400 --> 01:45:07.160
How do you talk around the dinner table
about whether you're increasing GDP?
01:45:07.280 --> 01:45:09.520
But you can have a great argument -
01:45:09.640 --> 01:45:13.600
- about who's solving
the most important problems in societies.
01:45:13.720 --> 01:45:18.160
And centering prosperity
in a moral context -
01:45:18.280 --> 01:45:25.000
- and one that also leads to increasing
amounts of material welfare, we think -
01:45:25.120 --> 01:45:30.080
- leads to fulfillment in the deepest way
that human societies can offer it.
01:46:11.000 --> 01:46:14.080
We didn't just hear about
a whole bunch of interesting things.
01:46:14.200 --> 01:46:17.720
Each one of us individually,
our perspective has changed.
01:46:17.840 --> 01:46:21.320
We're going to go out of this
richer than before.
01:46:21.440 --> 01:46:23.920
We all believe
we want to change the world.
01:46:24.040 --> 01:46:29.520
It's not just an empty
ivory-tower academic exercise.
01:46:29.640 --> 01:46:33.080
We don't need to bullshit to say
we're already starting -
01:46:33.200 --> 01:46:38.280
- to have a laundry list of solutions that
are important for the whole picture.
01:46:38.400 --> 01:46:42.240
What we haven't gotten yet
is the coherent story.
01:46:42.360 --> 01:46:45.520
How can our scientific ideas come out?
01:46:45.640 --> 01:46:49.720
We can do that not just by saying,
"I have one solution."
01:46:49.840 --> 01:46:53.640
That's good and dandy,
but it doesn't really matter much -
01:46:53.760 --> 01:46:56.640
- because you drown
in the noise of problems.
01:46:56.760 --> 01:47:01.680
However, if our perspective,
our world view can provide -
01:47:01.800 --> 01:47:07.000
- solutions to almost all the problems
here, then suddenly you have something -
01:47:07.120 --> 01:47:12.920
- that the previous paradigm of
how to see the world had no way of doing.
01:47:13.040 --> 01:47:15.800
Then can we hope for more?
01:47:26.280 --> 01:47:28.680
Our stock of solutions to human problems -
01:47:28.800 --> 01:47:32.120
- is a way of thinking
about the wealth of society -
01:47:32.240 --> 01:47:37.040
- and about growth as the rate at which we
are creating solutions to human problems.
01:47:37.160 --> 01:47:42.320
We must stop, so by the end of the century,
we'll be in a zero-carbon environment.
01:47:42.440 --> 01:47:46.000
We need negative emissions
if we want to fix this.
01:47:46.120 --> 01:47:49.920
Deliberative polls are the idea
of regular polls plus something.
01:47:50.040 --> 01:47:54.160
I know of nowhere in the world where
courses are taught in human history.
01:47:54.280 --> 01:47:59.280
If they want to be so big,
they must release their algorithms.
01:47:59.400 --> 01:48:02.640
Limiting the number of friends
that you can have.
01:48:02.760 --> 01:48:05.960
You cannot be both
a social network and a news feed.
01:48:06.080 --> 01:48:09.200
You have to break those
into two separate businesses -
01:48:09.320 --> 01:48:11.640
- and you can't use advertising.
01:48:11.760 --> 01:48:15.720
We have to put serendipity
back into cyberspace.
01:48:15.840 --> 01:48:20.920
We create artificial economies to see what
would happen if we tried out new ideas.
01:48:21.040 --> 01:48:26.000
Corporations would happily be regulated
as long as you regulate the competition.
01:48:26.120 --> 01:48:29.080
We should tax capital quite directly.
01:48:29.200 --> 01:48:34.520
Every economic act is
an explicitly moral choice.
01:48:35.240 --> 01:48:38.600
Optimism is going to be a tool
in solving these problems.
01:48:38.720 --> 01:48:43.800
We know what the challenge is. We have
to assume there's a good outcome.
01:48:43.920 --> 01:48:48.960
There are no guarantees. There may be
costs. That's the attitude we must have.