Hawking's final warning to humanity: superintelligence and supermen are coming

"I have lived an extraordinary life on this planet while traversing the universe with my mind and the laws of physics. I've been to the farthest reaches of the galaxy, into black holes, back to the origins of time. On Earth, I've experienced highs and lows, turmoil and peace, success and pain. I've been rich, I've been poor, I've been able-bodied, I've been disabled. I've been praised and criticized, but I've never been ignored. "

-- Stephen Hawking, Brief Answers to the Big Questions

Recently, Hawking's last book, "Brief Answers to the Big Questions", was published. In it he tackles several big questions: Will humanity survive its challenges? Should we colonize space? Does God really exist?


Creating AI may be the last big thing in human history

The greatest mistake mankind could make would be to dismiss science fiction as mere fiction and assume that highly intelligent machines will never be commonplace in people's lives.

Hawking argues that when AI becomes so superior to humans in the field of AI design that it can recursively improve itself without human help, we could face an intelligence explosion that would eventually lead to machines that are far more intelligent than we are.

So far, artificial intelligence in its primitive forms has proven very useful, but Hawking fears the consequences of creating something that can match or surpass humans. Humans, limited by slow biological evolution, could not compete and would be replaced. In the future, AI could develop a will of its own, one that conflicts with ours.

The emergence of superintelligence would mark an explosive shift for humanity. A superintelligent AI would be extremely good at achieving its goals, and if those goals do not align with ours, humanity is in serious trouble.

Hawking believes that the creation of artificial intelligence will be the biggest event in human history. Unfortunately, it could also be the last unless we learn how to hedge our risks.


Interstellar expansion may be the only way for humanity to save itself

Hawking believes it is inevitable that at some point in the next 1,000 years, a nuclear confrontation or environmental catastrophe will cripple the planet. At that point, Hawking hopes and believes that the intelligent race of humans will find a way to escape the crude bonds of Earth and thus survive the catastrophe.

Settling other planets now seems the only option for humanity in the face of such a catastrophe. Humans need to rekindle the enthusiasm of the early space age of the 1960s and explore other solar systems.

Interstellar expansion may be the only way to save humanity from itself.


Interstellar travel machines may become a new form of life

Relativity forbids faster-than-light travel, and even at the speed of light it would take about 100,000 years to cross the galaxy.
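To put rough numbers on that claim, here is a minimal sketch of the special-relativistic arithmetic. The 100,000-light-year galactic diameter and the 99%-of-light-speed cruise are illustrative figures, and acceleration is ignored:

```python
import math

def traveler_time(distance_ly, beta):
    """Return (Earth-frame years, ship-frame years) for a trip of
    `distance_ly` light-years at a constant speed of `beta` * c."""
    earth_years = distance_ly / beta            # duration seen from Earth
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)    # Lorentz factor
    return earth_years, earth_years / gamma     # moving clocks run slower

# Crossing the galaxy, roughly 100,000 light-years, at 99% of light speed:
earth, onboard = traveler_time(100_000, 0.99)
print(f"Earth frame: {earth:,.0f} years; on board: {onboard:,.0f} years")
```

Even at 99% of light speed the voyage takes over 100,000 years as measured from Earth; time dilation shortens it only for those on board (to roughly 14,000 years in this sketch), which is why interstellar distances are a hard physical constraint rather than an engineering problem.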

In science fiction, this difficulty is overcome by warping space or traveling through extra dimensions, but Hawking believes none of this will ever be possible, no matter how intelligent life becomes.

There may also be a simpler way in the future - almost within our reach already - to design machines durable enough for interstellar travel. On reaching a new star, such machines could land on a suitable planet and mine material to build more machines, to be sent on to still more stars.

Such machines would be a new form of life, based on mechanical and electronic components rather than macromolecules. They might eventually replace DNA-based life, just as DNA may have replaced an earlier form of life.


Gene-edited supermen may emerge this century

In Darwinian evolution, it takes vast stretches of time for people to grow smarter and kinder, and humanity does not have that long to wait for natural evolution. We are entering a new phase that might be called "self-designed evolution", in which humans will be able to change and improve their own DNA, correcting the flaws our original DNA imposes on us.

Intelligence is controlled by many genes, which will make it much harder to modify - but not impossible. Hawking is certain that within this century people will discover how to modify both intelligence and instincts such as aggression. He predicted that laws against human genetic engineering may be passed, but that some people will be unable to resist the temptation to enhance human traits such as memory, disease resistance, and longevity.

Once people give in to that temptation, supermen will appear. And once such superhumans emerge, unmodified humans who cannot compete will face major problems: they will die out or become irrelevant. The self-designed, by contrast, will improve themselves at an ever-increasing rate.


Is there any other intelligent life in the universe?

This question has long been debated.

Hawking suggests several possibilities.

We are used to thinking of intelligent life as the inevitable result of evolution, but it is more likely that evolution is a random process, with intelligence only one of many possible outcomes.

Looking at the chronology of evolution, intelligence may well be an unlikely development for life on Earth. It took about 2.5 billion years to go from single cells to multicellular organisms, a necessary precursor of intelligence, which is consistent with the probability of life developing intelligence being low. In that case, we might expect to find many other forms of life in the galaxy, but we are unlikely to find intelligent life.

Another possibility is that life fails to reach the intelligent stage because an asteroid or comet collides with its planet. A relatively small body struck Earth about 66 million years ago and wiped out the dinosaurs; some small early mammals survived, but anything as large as a human would almost certainly have been killed.

There is also a reasonable possibility that life forms and evolves into intelligent beings, but the system then becomes unstable and the intelligent beings destroy themselves. That would be a pessimistic conclusion, Hawking says, and he hopes it is not true.

He prefers a fourth possibility: there are other forms of intelligent life out there, but we have simply been overlooked. At our current stage, an encounter with a more advanced civilization might be a bit like the native inhabitants of the Americas meeting Columbus - and Hawking doubts the natives came out of that encounter better off.

Many of Stephen Hawking's predictions were borne out, one by one, in the course of human development. What is your view of the conjectures in this book?


