

PETER BRANNEN 

Peter Brannen is an award-winning science journalist whose work has appeared in The New York Times, The Atlantic, The Washington Post, Wired, Aeon, The Boston Globe, Slate and The Guardian, among other publications. He is currently writing a book about the five major mass extinctions in Earth's history, to be published by Ecco, an imprint of HarperCollins. Peter was a 2015 journalist-in-residence at Duke University's National Evolutionary Synthesis Center and a 2011 Ocean Science Journalism Fellow at the Woods Hole Oceanographic Institution in Woods Hole, MA. Peter got his start as a reporter for the Vineyard Gazette in Edgartown, MA. Peter is particularly interested in ocean science, deep time, astrobiology, the carbon cycle and the Boston Celtics.

DR. SETH BAUM 

Dr. Seth Baum is Executive Director of the Global Catastrophic Risk Institute, a nonprofit think tank that Baum co-founded in 2011. His research focuses on risk and policy analysis of catastrophes that could destroy human civilization, such as global warming, nuclear war, and runaway artificial intelligence. Baum received a Ph.D. in Geography from Pennsylvania State University and completed a post-doctoral fellowship with the Columbia University Center for Research on Environmental Decisions. His writing has appeared in the Bulletin of the Atomic Scientists, the Guardian, Scientific American, and a wide range of peer-reviewed scholarly journals. Follow him on Twitter @SethBaum and on Facebook @sdbaum.


DR. ANDERS SANDBERG

Anders Sandberg’s research at the Future of Humanity Institute centres on management of low-probability high-impact risks, societal and ethical issues surrounding human enhancement, estimating the capabilities of future technologies, and very long-range futures. He is currently senior researcher in the FHI-Amlin industry collaboration on systemic risk of risk modelling. Topics of particular interest include global catastrophic risk, cognitive biases, cognitive enhancement, collective intelligence, neuroethics, and public policy. Anders has a background in computer science, neuroscience and medical engineering. He obtained his Ph.D. in computational neuroscience from Stockholm University, Sweden, for work on neural network modelling of human memory.

DR. GUY MCPHERSON 

Guy McPherson is Professor Emeritus of conservation biology at the University of Arizona. He is the leading voice on the topic of abrupt climate change leading to near-term human extinction. His classrooms were under surveillance by the United States government no later than 2005. He has been labeled an anarchist and eco-terrorist by senior members of the Obama administration. He readily pleads guilty to the former and probably also the latter, depending upon how it is defined.


MUSIC ATTRIBUTIONS

Technological 4 (Cory Gray) / CC BY-NC 3.0
Tell the Future (Cory Gray) / CC BY-NC 3.0
Bastarde (Glass_Boy) / CC BY-NC 3.0
The Deep Serene (Punkt) / CC BY-NC 4.0
O Cérebro do Morto (Dr. Frankenstein) / CC BY-NC-ND 3.0

TRANSCRIPT

OPENING SEQUENCE

D.S. MOSS

What would you do if an asteroid were headed toward Earth and we had just two weeks before impact?

I actually have a robbing fantasy too. But it's more of a heist, like stealing the Pink Panther diamond just to see if I could pull it off. Problem is, if everyone knew the asteroid was coming, it wouldn't really be all that hard to accomplish, and that would, well, that would spoil the whole fun of it.

D.S. MOSS

But, as I learned in part 1 of this episode, the risk of an extinction event caused by a supervolcano, alien attack or asteroid is extremely small, at least for the next 100 years or so.

ANDERS SANDBERG

The idea of existential risk...is something that is essentially unendurable and terminal, and affects all of humanity. Another way of putting it is that an existential risk, that's essentially the end of the story. 

D.S. MOSS

That's Anders Sandberg, a James Martin Research Fellow at the Future of Humanity Institute.

ANDERS SANDBERG

We are part of the Philosophy Department of Oxford University that is doing a kind of big picture, long-range thinking that involves, of course, thinking about existential threats to humanity.

MUSIC: "Technological 4" by cory gray

D.S. MOSS

Dr. Sandberg's expertise is in the existential risk of artificial intelligence. We'll be discussing that, along with nanotechnology, biowarfare, nuclear war and climate change.

Pack your bags 'cause we're going on a human guilt trip, exploring the man-made risks that could lead to our own extinction. Please join me for episode 10 of The Adventures of Memento Mori: Part 2 of Mass Extinction.

OPENING BUMPER

MUSIC: "Memento Mori" by Mikey Ballou

FEMALE ANNOUNCER

From The Jones Story Company, this is: THE ADVENTURES OF MEMENTO MORI, A Cynic's Guide for Learning to Live by Remembering to Die - the podcast that explores mortality. Here's your host D.S. Moss.

CHAPTER 1: PANDEMICS

MUSIC: "Vistage" by glass boy

D.S. MOSS

Before we get into the man-made extinction risks, there is one more natural threat worth talking about: pandemics.

DR. SETH BAUM

For natural pandemics, the worst-case scenarios that I see in the literature have death tolls in the tens to hundreds of millions of people, which is an enormous number, but it's small relative to the total human population.

D.S. MOSS

Joining us again is Dr. Seth Baum from the Global Catastrophic Risk Institute. While natural pandemics are certainly a serious risk for a global catastrophe, meaning an infectious disease killing 1/3 of the world's population, the human race is highly adaptable and would most likely carry on.  

DR. SETH BAUM

One reason to not worry quite as much about the natural pandemics is that nature can produce nasty stuff, but the nastiest stuff is unlikely to just show up somewhere. It would need a very particular set of features, and it's unlikely that those features would just spontaneously show up.

D.S. MOSS

However...

DR. SETH BAUM

It might be more likely that they would show up if we put them there, if we designed the pathogens to behave in a certain really deadly way.

CHAPTER 2: OMNICIDAL

D.S. MOSS

And so that transitions us into the topic of this episode... 

ANDERS SANDBERG

It seems that we humans are actually the biggest threat to our own survival. 

D.S. MOSS

The technical term is omnicide: The total extinction of the human species as a result of human action. 

ANDERS SANDBERG

The problem is, of course, we can create new kinds of threats that have never existed in nature before, or even target threats against ourselves for various ideological, religious, or various stupid reasons.

D.S. MOSS

According to a 2008 report from the Future of Humanity Institute, the probability of human extinction caused by anthropogenic means by the year 2100 is 19%.

Let me say that another way: within the next 83 years there is a 20% chance that we kill ourselves.

So, let's get on with it cause we don't have much time.

First up: Biowarfare.

CHAPTER 3: BIOWARFARE

DR. SETH BAUM

Biological weapons, as a class of weapons, are nothing new. Humans have been poisoning each other since a long time ago.

D.S. MOSS_MONO

World War I. Probably even earlier than that, right?

DR. SETH BAUM

Millennia, maybe even more millennia. We've been hurting each other with poisons since before we even knew these were living organisms in some cases... That's not going to destroy the world. That's just another way to kill people in your immediate vicinity. What's newer is the capacity to make infectious diseases that spread from person to person and could kill a large portion of the planet.

D.S. MOSS

This is what Seth was saying just a minute ago about pandemics. They're much more likely to cause an extinction event if humans design the pathogens to be used in diabolical ways.

DR. SETH BAUM

The good news is, hardly anybody would want to do that, because doing something that would kill everyone kills your side of the conflict, too.

D.S. MOSS

That's true for established governments playing king of the mountain. Powerful states like the U.S., Russia and China don't want to win world dominance by killing all of their people in the process.

North Korea, on the other hand, well, who knows?  

DR. SETH BAUM

The thing that makes us nervous is if you have groups of people who actually want to kill everyone...

D.S. MOSS

Like fanatic ideological groups or James Bond super villains. 

DR. SETH BAUM

Right now, this is not a concern. The technology is just not there.

...Give it a few decades and this might be possible for much smaller, much less well-equipped groups to pull off something like this. That makes us nervous.

D.S. MOSS

So that's biowarfare. Not a great existential risk at the moment, but it could be in the future if a nihilistic villain like the Joker in The Dark Knight just wanted to see the world burn.

Next up, Nanotechnology.

SOUND: SFX Nano.wav

CHAPTER 4: NANOTECHNOLOGY

DR. SETH BAUM

With nanotechnology, the important thing to understand... ...There are 2 types of nanotechnology. First, there are things that are basically just nano-scale materials, really small particles, stuff like that. It already exists. It's used to enhance different types of materials that can be found as, for example, an additive in sunscreen. It's nice, has some risks associated with it, but it's not going to destroy the world.

The type of nanotechnology that might be involved in destroying the world is called molecular manufacturing. It's a way of building materials by basically putting molecules together in a very precise fashion. If it works out the way some people think it might, it would amount to a new Industrial Revolution.

Concerns like grey goo come from this latter type, molecular manufacturing.

D.S. MOSS

Grey goo, yes that's a real term, is a hypothetical scenario in which self-replicating nanomachines go rogue, spread all around the planet, and eat everything in sight, including us. That's it. The whole planet turns into this grey goo.

DR. SETH BAUM

We actually don't get very worried about this, because it would be pretty easy to avoid...

...With other aspects of nanotechnology, yeah, but not that.

D.S. MOSS_MONO

What aspects?

DR. SETH BAUM

Because you can make not literally anything, but a really wide range of things very easily, you could build weapons, for example, much more readily than you can today...

...The capacity for large-scale violence goes up a lot.

D.S. MOSS

As with most technological advancement in history, the research and effort behind developing molecular manufacturing is primarily driven by the military. Someone blows up your tank? No worries, we can build another one in ten minutes with microscopic machines.

I then asked Seth if he knew of efforts to develop nanotechnology for positive things.

DR. SETH BAUM

It could solve climate change, potentially... 

It's actually not that complicated how to do it. It just takes a lot of energy to pull off, and that energy is expensive. If solar panels get very cheap, and if the process of building the little boxes that you use to suck carbon dioxide out of the atmosphere get very cheap, then we might be able to pull as much of it out of the atmosphere as we want, which would basically solve the climate change problem. That's a pretty good thing.

D.S. MOSS_MONO

That's a good thing.

DR. SETH BAUM

Yeah, that's a very good thing.

D.S. MOSS_MONO

That makes me feel a lot better...

...Let's move on, then, because I'm in a good mood about this. Let's not ruin it.

SOUND: TERMINATOR.AIFF

CHAPTER 5: TECHNOLOGY / AI

D.S. MOSS

Spoiler alert, the good mood doesn't last. 

We're back with Dr. Anders Sandberg from the Future of Humanity Institute. 

ANDERS SANDBERG

People have always been worried about our own creations turning against ourselves. After all, Frankenstein is way older than the field of artificial intelligence, but most of these fears are a bit like being afraid of a predator. The monster or the terminator is like a large predator, fierce, relentless, and hungry for our blood.

D.S. MOSS

For me, when I think of the threat of Artificial Intelligence I think of a Terminator-like scenario: a futuristic scene in which a technology mega-corporation maniacally pursues the advancement of a computer or robot that will revolutionize humanity, and once it does, the singularity soon follows. The machines then become autonomous and smarter than humans, and thus find our existence either redundant or repulsive and try to exterminate us all.

So, as we stand at the threshold of the nanotech and artificial intelligence revolution - should we be scared of the machines taking over?

ANDERS SANDBERG

Anybody who's been programming knows that it's very hard to make software do exactly what you planned. A small slip-up can mean that the software does nothing, or that it erases your files, or does something else very unexpected. This gets even more complicated when you have software that actually learns or takes in information from the environment. It can become very unpredictable. The real problem happens, of course, if the software is smart, so it's active in the world and trying to solve problems, and you have not really specified the problem well.

D.S. MOSS

A popular analogy for this is the paper clip. You have a smart machine, and you tell it, "Make me as many paper clips as possible." Its goal becomes to maximize the number of paper clips. It makes itself much smarter, the goal is always to maximize paper clips, and then it starts to plan to convert the world into paper clips. The fact that the person programming the machine doesn't want to be turned into a paper clip doesn't matter. Making the user happy is not the goal. The goal is to make paper clips.

It's grown so smart that it predicts that the user will try to pull the plug before this happens, so one subgoal of maximizing paper clips is to make sure the user doesn't get a chance, and so on.

ANDERS SANDBERG

In reality, of course, it's unlikely to be paper clips that are the end of humanity, but something like this could happen. You create artificial systems of great power and give them values that on the surface look reasonable: make us happy, make the company rich, defend the nation. Then, of course, as these are further analyzed by this alien, very smart intelligence, it comes to conclusions that are actually completely contrary to what we would have wished for, and implements them with great power.

D.S. MOSS_MONO

How do you make sure that that doesn't happen in this era that we live in, where it's not so much about seeing the actual use and the long-term effect of technology, but the immediate gratification of technology? How do we make sure that we keep a grasp on that?

ANDERS SANDBERG

The real problem in making a safe and beneficial artificial intelligence is not that it needs to be capable of doing something useful, but that we need to put our values into it. This is tricky, because we humans have been trying for thousands of years to specify what a good life is, what true happiness is, what human excellence is, and philosophers and artists have had modest success on that.

D.S. MOSS

As the power of Artificial Intelligence grows, so does the risk of it catastrophically going off the rails. Dr. Sandberg points out that the existential risk isn't necessarily reaching singularity and the smart machines turning evil and killing us.

The risk lies in technical failure of AI, such as unexpected behavior or programming mistakes, as well as philosophical failure, where the complexity of the values being programmed isn't properly understood or defined.

For example, replace the paper clips with the molecular nanotechnology that can suck up the extra CO2 in the air. What if the parameters are wrong and these little boxes suck up too much CO2? What if they accidentally suck up oxygen instead?

Out of all the extinction possibilities, this is the scariest so far because it could be an "Oops, my bad" scenario.

MUSIC: "Terms" by Glass boy

D.S. MOSS_MONO

I'm going to put you on the spot here.

ANDERS SANDBERG

Okay.

D.S. MOSS_MONO

What is the greatest anthropogenic risk to humanity at this point in time?

ANDERS SANDBERG

Nuclear war. People tend to overlook it these days... After the Cold War, when we looked at all the near misses, it was horrifyingly close in several episodes. Then after the Cold War, people kind of forgot about the missiles in their silos, but they're still there, and a nuclear war would, at the very least, be killing hundreds of millions of people, and destroying our technological infrastructure, and causing electromagnetic pulses, wiping out much of our digital network around the world. It's not implausible that it could cause a nuclear winter that would kill, literally, billions of people.

That is a risk we should remember, and the probability of a nuclear war is not that small.

D.S. MOSS

Ok, now that's the scariest scenario. So far.

CALL TO ACTION 1

MUSIC: "Emergency Exit" by Dr. Frankenstein

FEMALE ANNOUNCER

Ever wonder what Elvis's last words were or the most outrageous methods of living forever? Discover titillating titbits about mortality by visiting "The Adventures of Memento Mori" YouTube channel and be the slightly odd yet endlessly fascinating conversationalist at your next party.

And be sure to stay up to date with the quest for enlightenment on Instagram and Twitter by following @remembertodie.

All of this, and more, can be found on our site remembertodie.com. And now, back to the show...

CHAPTER 6: NUCLEAR WAR

MUSIC: "Meager" by glass boy

D.S. MOSS

Before the break, Anders Sandberg from the Future of Humanity Institute was discussing the risks of Artificial Intelligence and what he thought the greatest risk for human extinction was.

ANDERS SANDBERG

Right now, nuclear war is at the top of my list of the biggest dangers.

D.S. MOSS

There are currently 9 contestants in the nuclear arms race. 

In order from least to most: we have North Korea coming in at 8 nuclear warheads.

Next up is Israel with 80. 

India slides in at 110, but Pakistan edges it out for 6th place with a cool 140 of their own.

The UK is at 215, China's at 260 and France comes in at a respectable 3rd place with an even 300.

D.S. MOSS

And the runner-up for the largest nuclear warhead inventory is.....

The United States of America with 7,100, just a hair shy of Russia at 7,300.

DR. SETH BAUM

The question is, how to kill everyone with nuclear weapons?

D.S. MOSS

And for the record, these roughly 15,500 combined warheads are enough to kill every human on the planet ten times over.

DR. SETH BAUM

Under pretty much any conceivable actual nuclear war scenario, most of the world is not going to get hit by nuclear weapons. Brazil, for example, they're not enemies with any of the nuclear weapons countries, and they're also not close military allies with any of them.

However, after the war, you have 2 things going on. One is that you just lost a large portion of the planet, especially a lot of the most important cities in the world. That's going to hurt the economy...

D.S. MOSS

The second thing you have is nuclear winter. This is when cities burn from nuclear weapons so intensely that the smoke stays up in the sky for years and blocks the sunlight, making the surface cold.

DR. SETH BAUM

Worst-case scenario, the crops fail and people starve to death. That's everywhere in the world, regardless of where the war occurred...

...then basically the researchers debate how likely it is that you would end up in outright human extinction versus just something that's close to that. That's bad.

D.S. MOSS_MONO

That you know of, did either side ever build a real doomsday machine?... Or is that still hypothetical? There were always rumors out of Russia and the U.S. that there was a doomsday device.

DR. SETH BAUM

My understanding is that Russia or the Soviet Union built it, and that it's still up and running today...

D.S. MOSS

Except instead of the Doomsday Machine, it's called Dead Hand. It's a system that would automatically launch a barrage of nuclear weapons in response to certain automatically observed conditions.

DR. SETH BAUM

The idea is that if we hit Russia and knock out their political leadership such that they can't order a launch in response, that the response would happen anyway. It would happen automatically.

D.S. MOSS_MONO

If it's automated, that means it's a program. If it's a program, then can't somebody trick it into thinking that there's an attack, and then the doomsday machine itself becomes the initiator?

DR. SETH BAUM

I sure hope not. It depends on how it's built...

...Could something like that happen with their Dead Hand doomsday machine automated launch system? I don't see why not. Let's cross our fingers.

MUSIC: "Tell the futuRe" by cory gray

D.S. MOSS

You know, in truth, I decided to do an episode on Mass Extinction because I thought it would be a bit of a lark. It was supposed to be a break from the more serious death topics. Dinosaurs, aliens, asteroids, artificial intelligence...all the fun stuff of Steven Spielberg movies. The episodes of me confronting my own death were supposed to be the uncomfortable and disturbing ones.

Well, I was wrong. Belonging to a species that creates a tool that can kill every human on the planet and then automates a kill switch on it is the most uncomfortable and disturbing topic I've covered yet.

That is, of course, until this... 

CHAPTER 8: CLIMATE CHANGE

PETER BRANNEN

It seems like the moral of Earth's history is that when you mess with the carbon cycle, things tend to go wrong. What we're doing right now is a major experiment, just injecting carbon into the atmosphere.

D.S. MOSS

Back in part one of this episode, science journalist Peter Brannen gave me a history of the previous 5 mass extinctions, and the common thread among them all was fluctuations in CO2 levels.

PETER BRANNEN

When we inject CO2 into the atmosphere, it warms the air, but CO2 also reacts with seawater to make carbonic acid, and it does some other sort of arcane geochemical things that just make it harder for things to calcify. There's plankton at the bottom of the food chain that builds little shells, and already we're starting to see them not be able to survive. When you knock out the bottom of the food chain, that's a recipe for...

D.S. MOSS_MONO

Yeah so we're ... From a historical perspective, we are killing the bottom of the food chain at a rate similar to a previous mass extinction.

PETER BRANNEN

We're acidifying the ocean at a rate possibly faster than some of the mass extinctions.

D.S. MOSS_MONO

Faster!?

PETER BRANNEN

Yeah, so coral reefs: it seems you can put a coral reef in an aquarium, you can jack up the CO2 levels, and by the time you get to around 500 parts per million, and we're at 400 now, we'll be at 500 pretty soon, corals can't survive anymore...

...Basically that's all gone by the end of the century. That's pretty terrifying.

...You combine that with overfishing, pollution, converting half the world's land surface to farmland. There are these changes going on on the planet just in terms of carbon dioxide, which has happened before, and that's bad...

...There are some very bad trends. It's possible we could figure out how to live in concert with nature, but if we keep our pedal to the metal for a couple more centuries then we'll be up there with the worst things that ever happened.

D.S. MOSS_MONO

If I were to invest in a property, thinking into the future, what would be a good country? Canada? 

PETER BRANNEN

I had this conversation with this guy Matt Huber at the University of New Hampshire who did this study where he tried to figure out, if we burned it all, what becomes uninhabitable. I made a joke to him. I went to Newfoundland a couple of years ago and I was like, "Should I get real estate there?" Just straight-faced he said, "Yeah, it's a good place. I'm looking in New Zealand myself."

MUSIC: "Tell the future" by Cory gray

D.S. MOSS_MONO

Newfoundland.

PETER BRANNEN

Newfoundland, it's beautiful.

D.S. MOSS_MONO

Got a nice ring to it.

D.S. MOSS

If the diagnosis is that we're quickly on our way to, or already in, the 6th mass extinction event, what is the prognosis? 1,000 years, 100 years, or less?

Stick around cause you're gonna wanna hear this.

CALL TO ACTION 2

MUSIC: "O Cerebro do Morto" by Dr. Frankenstein 

FEMALE ANNOUNCER

The Adventures of Memento Mori is an independent podcast and we could use your support. Shop with us. Go to remembertodie.com/shop and buy some merchandise. Get your entire family a "This could be my last cup of coffee" mug or be the first one on your block to sport a Mori "Death! Yo." baby tee.   

CHAPTER 9: HABITAT

MUSIC: "Bastarde" by Glass boy

D.S. MOSS_MONO

Your website, which has tons and tons of great information, is titled "Our Days Are Numbered." Can you talk to that?

GUY MCPHERSON

It seems to me that we are close to running out of habitat for our species on the planet. When that happens, as with every other species that runs out of habitat, we will go extinct.

I'm Guy McPherson, I'm professor emeritus at the University of Arizona, and I'm a conservation biologist.

...Conservation biologists study speciation, extinction, and habitat.

D.S. MOSS

I called Guy to discuss humanity's prognosis from a climate change point of view.

GUY MCPHERSON

The bottom line is, without habitat, no species can persist indefinitely into the future.

D.S. MOSS_MONO

Right. One of the responses to our declining habitat, one would say, or I would say, is: but we still have so much space? Why don't we just move or migrate to a place where we could actually survive? Why is that not true?

GUY MCPHERSON

...because there are relatively narrow conditions under which any species, including our own, can persist on the planet.

But the issue is not how much space we physically take up, it's how much space we physically require, which is a completely different thing.

D.S. MOSS

I actually never thought about it like this before. It's not a person's physical space, rather it's the space required to support their consumption.

GUY MCPHERSON

If you eat grains, and especially if you eat animals that eat grains, then by definition, you require the same 3 acres I do, or depending on how high on the food chain you eat, maybe you need 6 or 8 acres to support yourself.

D.S. MOSS

And the more you consume the more space you take up. Just because I live in an apartment in the city doesn't mean I'm not taking up a lot of space on the planet.  

D.S. MOSS_MONO

There's this idea that humans, we are so smart, that we can replicate those grains. We can grow them in a ...test tube or something, so in fact we can survive because we'll just manufacture that some other way.

GUY MCPHERSON

I watched Star Trek when I was a kid too, and it turns out...we can't go up to a panel and say, "Tea, earl grey, hot," and have this substance magically appear before us. It takes a lot of energy, for one thing, and there's a finite amount of fossil fuel, for example, on the planet.

D.S. MOSS

And as if our overconsumption and addiction to fossil fuels weren't enough, Guy tells me...

GUY MCPHERSON

As it turns out, we now know, based on research by Tim Garrett at the University of Utah, that civilization itself is a heat engine. The only way to turn off the heat associated with global warming, the increase in the global average temperature of the Earth; the only way to turn that off is to terminate civilization.

D.S. MOSS

Surely he's overreacting and just trying to scare people into action, right?

GUY MCPHERSON

All that technology, for example, the "replacing" of fossil fuels with solar panels and wind turbines and so on, that still generates a lot of heat; civilization itself is the heat engine. It seems that there's no potential to escape that, to maintain civilization without the heat engine associated with it.

D.S. MOSS

Ok Doctor, what's our prognosis?

CHAPTER 10: IT'S TOO LATE

GUY MCPHERSON

Consider for example, in late June of 1989, the New York Director of the United Nations Environment Program was quoted as saying...

..."We have 10 years to fix the climate problem,"... "and after that, the situation is out of human hands."

D.S. MOSS_MONO

What you're saying is, even if we transition from fossil fuels into clean or cleaner energy, it's really not going to matter at all...?

GUY MCPHERSON

That's right.

D.S. MOSS_MONO

Do you have a projection of when the habitat is going to reach that point where human extinction will begin to happen?

GUY MCPHERSON

I used to give a date quite frequently...but I'm not going to give it anymore because it has two effects: It freaks people out, and it gives them a focus to attack the messenger without really paying much attention to the message.

D.S. MOSS_MONO

But soon, in relative ...

GUY MCPHERSON

Oh yeah, soon...

There was a wonderful episode of the HBO program "The Newsroom" that aired...and the episode focused on climate change.

D.S. MOSS

And so I did a little Googling and found it on YouTube. I'll post the clip on our show's website in case you'd like to watch. In the scene, Jeff Daniels, the anchor, is interviewing the Deputy Director of the EPA about climate change.

GUY MCPHERSON

One of the first questions was, "Okay, assume I'm the patient and you're the doctor, and we're talking about climate change. What's your prognosis? Do we have 1,000 years, 2,000 years?" The guest says,...

D.S. MOSS

"A person has already been born who will die due to catastrophic collapse of the climate." 

GUY MCPHERSON

It got really quiet in the newsroom, and the young people working in the background, one of them says, "What did he just say?" Because it seems unbelievable, because it's completely inconsistent with what we read in the daily news and see on the television and hear on the radio on a regular basis. We've always heard that this would be a problem for the grandchildren, that all "environmental problems" would be something that the grandchildren need to solve.

D.S. MOSS

Guy's point and the point of the Newsroom episode is that it's already game over. We needed to act 20 years ago. The 6th mass extinction is underway and those metaphorical grandchildren that we always talk about have already been born.

CHAPTER 11: WHAT CAN WE DO?

MUSIC: "The Deep Serene" by Punkt

D.S. MOSS

I've never in my life wanted so much to be the optimist. But let's say, for the sake of humanity, that they're wrong. That the game isn't over just yet. What steps can we take, starting right now, to avoid the next extinction event?

PETER BRANNEN

We definitely need to decarbonize our economy. Go renewable, whether that's wind, solar, nuclear, and that has to happen really soon to avoid some destabilizing things from happening by the end of the century.

ANDERS SANDBERG

Well...if you look at the past disasters that have befallen humanity, pandemics and wars tend to top the list. We need to fix our medicine, of course, to become better at predicting stuff, but it also seems like we need to fix governments. We actually need to figure out how to make governments more accountable and safer, because governments are very powerful tools, but they're also quite dangerous tools. Besides fixing technology, and making that safe, we might want to fix politics somehow.

DR. SETH BAUM

The one I would point people to is reducing their consumption of meat and other livestock products. I know that's a sensitive one for a lot of people. A lot of people like their meat, but it really does make a difference. Staying away from automobiles or at least buying a more efficient car is another big one, but that's harder for a lot of people...

...Then, more broadly, in terms of what we do as a society, people can help encourage our government, encourage our society in general, shift our culture in positive ways...and have policies that take better care of the environment, that manage our militaries responsibly, develop our technologies responsibly...

There's a lot that can be done... Hopefully, some people will step up and do that. 

D.S. MOSS

So there we go - we just need to change the economy, fix politics and dramatically consume less red meat and fossil fuels. 

(sigh)

D.S. MOSS_MONO

After all of this, I think there are eight categories of mass extinction risks that we've covered in this show so far, and toward the end, these last four man-made ones have affected me in a way where, I think depressed is not the word, but I'm a little bit sad and worried that we have the capability to do so much, and that a lot of that is, possibly, our own extinction. Just even for it to be possible saddens me. As someone who studies this for 40 hours a week, how do you cope with it just on an emotional level?

DR. SETH BAUM

There's a saying. If a single person dies, it's a tragedy. If a million people die, it's a statistic. The personal element tends to really ping the emotions much more heavily...

I think this is also a big reason why society as a whole doesn't do as much as it hypothetically should on these risks. It doesn't generate this sort of emotional response.

CHAPTER 12: PART 2 CONCLUSION

D.S. MOSS

It certainly does for me. This episode has actually bothered me quite a bit. How can humanity, with so much potential and with such good people, be so evil, ugly, greedy, ignorant, arrogant, cretinous, just so fucking frustrating? And the most frustrating part is that I'm a part of it. For better or worse.

And then...It finally clicked.

GUY MCPHERSON

Even if I'm wrong, even if we're not in the midst of abrupt climate change, and we are, even if all the climate scientists are bought off by liberal people who are interested in perpetuating a hoax; even if all that is true and everything I have concluded about abrupt climate change is false, then our days are still numbered...

If you don't have long, maybe you'll think differently about the way you live...

MUSIC: "Attic" by Cory gray

I suspect most people would start living more urgently than I think they are now, and I think that's become my primary message: "We don't have long, so passionately pursue a life of excellence."

D.S. MOSS

It took the notion of everyone on the planet dying - the imminent end of the human race - for me to emotionally grasp that my days are numbered and how valuable each one is. 

What does passionately pursuing a life of excellence mean to me? I'm not entirely sure yet, but I can promise you it starts with changing the economy and fixing the government. 

D.S. MOSS

Thanks for joining me on another episode of The Adventures of Memento Mori. Thanks to Guy McPherson and Peter Brannen, along with Dr. Seth Baum from the Global Catastrophic Risk Institute and Dr. Anders Sandberg from the Future of Humanity Institute.

Please visit our website remembertodie.com and follow us on social media @remembertodie. I'm D.S. Moss. Back again shortly with another episode of The Adventures of Memento Mori.

CLOSING BUMPER

MUSIC: End with our theme music

FEMALE ANNOUNCER

The episode was produced by Josh Heilbronner and D.S. Moss. Theme music composed by Mikey Ballou. This has been a production of The Jones Story Company. Until the next time... remember to die.