"We're not gonna make it, are we? People, I mean"
This is a quote from Terminator 2: Judgment Day. Young John Connor makes the remark to the T-800, who replies, "It's in your nature to destroy yourselves." I've been thinking about that scene recently, and I can't shake the idea that we're not going to make it, that humanity might wipe itself out within my own lifetime. Let's look at the current problems facing the planet.
Climate Change
Please, no politics here. This topic has oddly become a political divide, but it really shouldn't be. It's true the climate does change naturally, but those natural changes occur over tens of thousands to millions of years, driven by factors like volcanic activity, changes in the Earth's orbit (Milankovitch cycles), and natural variations in greenhouse gases. The problem with modern climate change is that human activity is speeding up the process far beyond the natural rate, and ecosystems cannot adapt that quickly. The evidence that humans are having a negative impact is incontrovertible.
This isn't the first time CO2 has caused a mass extinction event. At the end of the Triassic period, a mass extinction was triggered by the Central Atlantic Magmatic Province (CAMP) eruptions. These massive volcanic eruptions released trillions of tons of CO2 into the atmosphere, wiping out around 75% of all species. However, it should be noted that this happened over tens of thousands of years. Humanity has released around 2.5 trillion tons of CO2 since the start of the Industrial Revolution, which works out to a rate roughly 100 times faster than the end-Triassic event. See the problem?
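If you want to see where a "roughly 100 times faster" figure can come from, here's a quick back-of-envelope comparison in Python. The human total (2.5 trillion tonnes over roughly 250 years) is the figure quoted above; the CAMP total and timescale below are placeholder assumptions consistent with "trillions of tons" over "tens of thousands of years", so treat the exact multiplier as illustrative rather than definitive. The point is the method: compare tonnes per year, not totals.

# Rough rate comparison: human CO2 emissions vs the end-Triassic CAMP eruptions.
# The CAMP figures are placeholder assumptions, not measured values.

human_co2_tonnes = 2.5e12   # ~2.5 trillion tonnes since the Industrial Revolution
human_years = 250           # roughly the mid-1700s to today
human_rate = human_co2_tonnes / human_years   # ~1e10 tonnes per year

camp_co2_tonnes = 5e12      # assumed: ~5 trillion tonnes released by CAMP in total
camp_years = 50_000         # assumed: released over ~50,000 years
camp_rate = camp_co2_tonnes / camp_years      # ~1e8 tonnes per year

print(f"Human rate: {human_rate:.1e} t/yr")
print(f"CAMP rate:  {camp_rate:.1e} t/yr")
print(f"Humans are emitting roughly {human_rate / camp_rate:.0f}x faster")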
The Sixth Great Extinction
Climate change and other human activities have impacts on the planet beyond it simply being a bit warmer. Pollution, ocean acidification and deforestation are all contributing to a sixth mass extinction event, and this could have a devastating impact on humanity. For example, we're currently on the verge of an "insect apocalypse" due to the alarming decline in insect numbers. I know most of us hate insects (I really don't like moths because they always seem to fly in my face), but they're vital for pollination, for providing a food source for other animals, and for natural pest control. Studies have been carried out all around the world (for example, one found that UK butterfly numbers have roughly halved since 1976), and whichever study you look at, the decline is staggering. Flowering plants and global food crops depend on insects. Without them, we could see worldwide food shortages, as well as the decline of the animals that feed on insects, leading to a wider biodiversity collapse.
The Singularity
At face value, this one seems like the stuff of sci-fi, but it's a real threat that I feel nobody is taking seriously. Most people are already aware of the examples above, so here's a brief overview of this one. The Singularity refers to the moment when AI becomes capable of improving itself autonomously, creating a feedback loop of ever-accelerating intelligence and technological advancement. Sounds cool, right? However, this could turn into a nightmare for humanity if we lose control. Imagine an AI tasked with preventing global warming. It would have no inherent reason to prioritise humanity's survival unless explicitly programmed to do so, which means its simplest solution might be to wipe out humanity. It might even come up with ways to do this humanely, in ways we wouldn't even notice until it was too late. A superintelligent AI would also have a complete understanding of human psychology and behaviour. It would know exactly how to manipulate us in order to serve its own goals. Throw quantum computing into the mix and we have a being that is almost god-like from our current perspective.
Like I said, it sounds like sci-fi, but it's not only a real possibility, it's one that could happen soon. Computer scientist and futurist Ray Kurzweil has predicted 2045, but as we're in the midst of an AI tech race it could come even sooner than that, and unless safeguards are put in place to protect humanity, the AI that triggers the Singularity could be humanity's last invention.
Do you think any of the above will mean we're going to wipe ourselves out within our lifetimes? Any other threats you can think of? ****EXCEPT FOR WAR - I was going to use this as an example, but it would mean bringing politics into the topic****
TL;DR notes - We could be fucked in our lifetimes.